This is a deeper investigation of potential cardinality errors in the CBO.
At this point I want to thank Jonathan Lewis for his brilliant book about cost-based optimization
in Oracle. It was, and still is, the basis of all my investigations.
I have a query (described in part 1) which caused me a lot of problems with the temporary tablespace.
I used a 10053 event to generate details about the CBO decisions for that particular query. Here I want to show only a small part of that 5 MB trace file. According to the file, join order 295 has been chosen as the best join order.
...
Final - All Rows Plan: Best join order: 295
...
Here is a part of the 10053 file:
Join order[295]: TAB1_MV[P]#0 TAB2_MV[E]#1 TAB3_MV[S]#13 TAB4_MV[H]#11 TAB5_MV[D]#9 TAB6_MV[AC]#12 TAB7_MV[SR]#7 SP_MV[R]#19 ... rest of tables
***************
Now joining: SP_MV[R]#19
***************
....
Join Card: 0.00 = outer (11.29) * inner (30335.98) * sel (1.8863e-09)
Join Card - Rounded: 1 Computed: 0.00
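To see why the computed cardinality collapses to zero, the trace's formula can be replayed in a quick sketch (plain Python, not Oracle code), using the outer cardinality, inner cardinality and join selectivity shown above:

```python
# Join Card = outer cardinality * inner cardinality * join selectivity
outer = 11.29
inner = 30335.98
sel = 1.8863e-09  # note the tiny selectivity

card = outer * inner * sel
print(f"{card:.5f}")  # a fraction of a row: shown as Computed: 0.00, Rounded: 1
```

With a selectivity this small, the product is well below one row, so the trace prints Computed: 0.00 and rounds the estimate up to 1.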
There is no information here about joining the previous tables TAB1 to TAB7, but the join cardinality for those 7 tables has been calculated as 11.29. I wanted to know how it was calculated. The CBO can reuse calculations from other join orders, so I had to find where it was done.
Here is some information about the tables. This is the single-table cardinality, which is used in the join calculations.
TAB1_MV[P]#0 Card: Original: 1 Rounded: 1 Computed: 1.00 Non Adjusted: 1.00
TAB2_MV[E]#1 Card: Original: 3 Rounded: 3 Computed: 3.00 Non Adjusted: 3.00
TAB3_MV[S]#13 Card: Original: 21645 Rounded: 21645 Computed: 21645.00 Non Adjusted: 21645.00
TAB4_MV[H]#11 Card: Original: 550 Rounded: 550 Computed: 550.00 Non Adjusted: 550.00
TAB5_MV[D]#9 Card: Original: 117 Rounded: 117 Computed: 117.00 Non Adjusted: 117.00
TAB6_MV[AC]#12 Card: Original: 199128 Rounded: 2797 Computed: 2796.88 Non Adjusted: 2796.88
TAB7_MV[SR]#7 Card: Original: 23 Rounded: 23 Computed: 23.00 Non Adjusted: 23.00
As you can see, num_rows and the computed cardinality can differ, so the CBO recalculates the number of rows. Here are the detailed table stats:
Table Stats::
Table: TAB6_MV Alias: AC
#Rows: 199128 #Blks: 2688 AvgRowLen: 171.00
Column (#16): TAB6_CR_ID
AvgLen: 2.00 NDV: 6262 Nulls: 151566 Density: 8.4931e-05 Min: 1000 Max: 70516
Histogram: HtBal #Bkts: 254 UncompBkts: 254 EndPtVals: 140
Column (#17): TAB6_CR_NUMBER
AvgLen: 2.00 NDV: 5 Nulls: 150840 Density: 7.1446e-04 Min: 1 Max: 33
Histogram: Freq #Bkts: 5 UncompBkts: 48288 EndPtVals: 5
Column (#2): TAB6_ID
AvgLen: 5.00 NDV: 14584 Nulls: 0 Density: 6.1554e-05 Min: 607 Max: 70943
Histogram: HtBal #Bkts: 254 UncompBkts: 254 EndPtVals: 230
Column (#25): TAB6_CR_SRCE_CODE
AvgLen: 2.00 NDV: 2 Nulls: 150840 Density: 0.0016153
Histogram: Freq #Bkts: 2 UncompBkts: 48288 EndPtVals: 2
There are NULL values in 3 of the 4 join columns, and because Oracle does not match NULL values in join conditions, it adjusts the number of rows to account for them.
For a single column with NULL values the calculated table cardinality is:
Cal cardinality = num_rows - number of nulls
Column              # of rows   # of nulls   # of not null   # not null / # of rows
TAB6_CR_ID          199128      151566       47562           0.238851
TAB6_CR_NUMBER      199128      150840       48288           0.242497
TAB6_ID             199128      0            199128          1
TAB6_CR_SRCE_CODE   199128      150840       48288           0.242497
If there is more than one column with NULL values, some more calculation is needed:
Cal cardinality col 1 and 2 = Cal cardinality col 1 * (number of not null / num_rows) for col 2
and so on:
Cal cardinality col 1, 2 and 3 = Cal cardinality col 1 and 2 * (number of not null / num_rows) for col 3
So in our example:
Cal card = 47562 * 0.242497 * 1 * 0.242497 = 2796.88
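The arithmetic above can be checked with a short sketch (plain Python, not Oracle code), using the row and null counts for TAB6_MV from the trace:

```python
num_rows = 199128

# NULL counts per join column, in the order used above:
# TAB6_CR_ID, TAB6_CR_NUMBER, TAB6_ID, TAB6_CR_SRCE_CODE
nulls = [151566, 150840, 0, 150840]

# Start from num_rows - nulls for the first column ...
card = num_rows - nulls[0]

# ... then scale by the not-null fraction of each remaining column.
for n in nulls[1:]:
    card *= (num_rows - n) / num_rows

print(round(card, 2))  # matches the trace's Computed: 2796.88
```

This reproduces the adjusted cardinality of 2796.88 that the trace reports for TAB6_MV instead of its 199128 rows.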
Did you read part 1? The calculation above is based only on the number of rows and the number of nulls; there is no information about the data values. Neither the number of distinct values nor the histograms were used.
So let's watch the CBO do its work.
Join order[294]: TAB1_MV[P]#0 TAB2_MV[E]#1 TAB3_MV[S]#13 TAB4_MV[H]#11 TAB5_MV[D]#9 TAB6_MV[AC]#12 TAB7_MV[SR]#7 ...
***************
Now joining: TAB2_MV[E]#1
***************
...
Join Card: 1.00 = outer (1.00) * inner (3.00) * sel (0.33333)
Join Card - Rounded: 1 Computed: 1.00
...
***************
Now joining: TAB3_MV[S]#13
***************
...
Join Card: 373.19 = outer (1.00) * inner (21645.00) * sel (0.017241)
Join Card - Rounded: 373 Computed: 373.19
...
***************
Now joining: TAB4_MV[H]#11
***************
...
Join Card: 126.39 = outer (373.19) * inner (550.00) * sel (6.1576e-04)
Join Card - Rounded: 126 Computed: 126.39
...
***************
Now joining: TAB5_MV[D]#9
***************
...
Join Card: 126.39 = outer (126.39) * inner (117.00) * sel (0.008547)
Join Card - Rounded: 126 Computed: 126.39
...
***************
Now joining: TAB6_MV[AC]#12
***************
...
Join Card: 11.29 = outer (126.39) * inner (2796.88) * sel (3.1939e-05)
Join Card - Rounded: 11 Computed: 11.29
...
***************
Now joining: TAB7_MV[SR]#7
***************
...
Join Card: 11.29 = outer (11.29) * inner (23.00) * sel (0.043478)
Join Card - Rounded: 11 Computed: 11.29
...
As you can see, the calculated output cardinality is 11.29.
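The whole chain can be replayed in a sketch (plain Python, not Oracle code): each step multiplies the running cardinality by the inner table's cardinality and the join selectivity taken from the trace lines above.

```python
# (inner cardinality, join selectivity) per join step from the 10053 trace
steps = [
    (3.00,     0.33333),     # TAB2_MV
    (21645.00, 0.017241),    # TAB3_MV
    (550.00,   6.1576e-04),  # TAB4_MV
    (117.00,   0.008547),    # TAB5_MV
    (2796.88,  3.1939e-05),  # TAB6_MV (null-adjusted cardinality)
    (23.00,    0.043478),    # TAB7_MV
]

card = 1.00  # single-table cardinality of TAB1_MV
for inner, sel in steps:
    card = card * inner * sel

print(round(card, 2))  # close to the trace's final Computed: 11.29
```

Note how the TAB6_MV step uses the null-adjusted 2796.88, not the table's 199128 rows.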
I ran a small experiment and cut out of the SQL query the part related only to those 7 tables.
select count(*) FROM TAB4_MV H, TAB3_MV S,TAB1_MV P,TAB2_MV E,TAB5_MV D,TAB6_MV AC,TAB7_MV SR
WHERE S.TAB3_TC = H.TAB4_TC AND S.TAB3_CON_ID = H.TAB4_ID
AND S.TAB3_CON_NUMBER = H.TAB4_CON_NUMBER
AND H.TAB4_DETAIL_PAY_CODE = D.TAB5_DETAIL_CODE
AND S.TAB3_CON_ID = AC.TAB6_CR_ID
AND S.TAB3_CON_NUMBER = AC.TAB6_CR_NUMBER
AND AC.TAB6_CR_SRCE_CODE = SR.TAB7_CODE
AND S.TAB3_CON_ID = P.TAB1_ID
AND P.TAB1_ECAT_CODE = E.TAB2_CODE;
Here is the result of that query:
COUNT(*)
----------
141354138
Comparing that to the estimated 11.29 rows, you can see that the real join cardinality is more than 12 million times larger than the calculated one.
OK, so let's execute the full SQL query. The best join order is 295 (see the calculation at the beginning of this post). As you can see, the estimated cardinality is now 1, but the result of the query
select count(*) FROM SP_MV R, TAB4_MV H,TAB3_MV S,TAB1_MV P,TAB2_MV E,TAB5_MV D,
TAB6_MV AC,TAB7_MV SR
WHERE R.SP_ENTITY_IND = 'P'
AND R.SP_CHANGE_IND IS NULL
AND R.SID = S.TAB3_STU_ID
AND S.TAB3_TC = H.TAB4_TC
AND S.TAB3_CON_ID = H.TAB4_ID
AND S.TAB3_CON_NUMBER = H.TAB4_CON_NUMBER
AND H.TAB4_DETAIL_PAY_CODE = D.TAB5_DETAIL_CODE
AND S.TAB3_CON_ID = AC.TAB6_CR_ID
AND S.TAB3_CON_NUMBER = AC.TAB6_CR_NUMBER
AND R.SID = AC.TAB6_ID
AND AC.TAB6_CR_SRCE_CODE = SR.TAB7_CODE
AND S.TAB3_CON_ID = P.TAB1_ID
AND P.TAB1_ECAT_CODE = E.TAB2_CODE;
is
ORA-01652: unable to extend temp segment by 64 in tablespace TEMP
This is not what the customer expects as the result of a report. I found out that when the cardinality of a table is set via dbms_stats.set_table_stats, it is not changed in the Single Table Cardinality section. I still have to investigate more deeply why, but for now this is my solution to the problem.
See you in part 3.
regards,
Marcin