I felt the need to write about this important topic after observing that fraud is often detected only by chance during an audit, whether by external or internal auditors. New auditors and trainees may miss fraudulent transactions because of limited experience and, at times, a lack of analytical audit procedures. This article focuses on two points: fraud itself and the techniques for detecting it. I believe that once auditors properly understand these techniques, they will be better able to identify risky areas and to implement a fraud detection plan, whether auditing the financial statements as a whole or a group of transactions within them.
Before proceeding to fraud detection, we need to understand what fraud actually is. The Association of Certified Fraud Examiners defines fraud as "deception or misrepresentation that an individual or entity makes knowing that the misrepresentation could result in some unauthorized benefit to the individual or to the entity or some other party." In law, it is defined as an intentional concealment of facts; more broadly, if an error occurs and someone intentionally arranges to hide it, for whatever reason, that too is regarded as fraud.
Fraud may occur at the entity level or the individual level. When committed by an individual, it is obviously done to obtain an undue benefit. When committed at the entity level, there can be various motives, including deceiving investors, legislators, creditors and so on.
The COSO fraud report of 2010 states that several motivational forces encourage companies to commit fraud. These include, but are not limited to: meeting internal and external earnings expectations, hiding a deteriorating financial position, raising the share price in the stock market, the need to bolster financial performance ahead of pending equity or debt financing, and the desire to increase management compensation tied to financial performance and good results.
Companies normally manipulate their financial statements through the following:
1. Fictitious sales
2. Improper expense recognition
3. Incorrect asset valuation
4. Hidden liabilities, and
5. Unsuitable disclosures.
The auditor can spot red flags in risky areas by applying analytical procedures, for example:
· Revenue growing without corresponding growth in cash flows,
· Consistent sales growth while competitors are experiencing weak performance,
· A rapid and unexplainable rise in the number of days' sales in receivables, together with growing inventories,
· A significant surge in the company's performance within the final reporting period of the fiscal year,
· Gross profit margins held consistent while the industry is facing pricing pressure,
· A large buildup of fixed assets, which may indicate capitalization of operating expenses rather than expense recognition,
· Depreciation methods and estimates of assets' useful lives, since an overstated useful life decreases the annual depreciation expense,
· A weak system of internal controls, and
· An outsized frequency of complex related-party or third-party transactions, many of which add no tangible value (and can be used to conceal debt off the balance sheet).
It is normally considered that the detection and prevention of fraud is the responsibility of management. That holds true, but only where the company intends to avoid fraud at the individual level. After the recent audit failures, especially the mega-fraud at Enron, the theory of fraud detection and prevention has changed. Auditors are now required to design procedures and use modern mathematical techniques to identify risky areas, both to perform better audits and to avoid unwanted surprises in audited results.
How to detect potential fraud:
Having understood the above, one should want to know the fraud detection techniques. Based on my working experience, I conclude that auditors use two approaches to put red flags on risky areas and identify potential fraud:
1. Classical approach
2. Modern/ mathematical approach.
Classical approach:
Most auditors use the classical approach to identify risky areas. This approach includes:
· Horizontal analysis,
· Vertical analysis,
· Ratio analysis, and
· Substantive procedures.
Horizontal and vertical financial statement analysis offers a straightforward approach to fraud detection. Vertical analysis expresses every item in the income statement as a percentage of revenue, so that year-over-year shifts in those percentages can flag a potential cause for concern. The same approach can be applied to the balance sheet, using total assets as the benchmark, to monitor significant deviations from normal activity. Horizontal analysis works similarly, except that rather than an account serving as the point of reference, each figure is expressed as a percentage of its base-year value. In either case, unexplainable variations in the percentages serve as red flags requiring further analysis and substantive procedures.
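The two analyses above can be sketched in a few lines of Python. The income statement figures below are hypothetical, chosen only to show the mechanics:

```python
# Vertical and horizontal analysis on a toy income statement (hypothetical figures).
income_statement = {
    "revenue":        [1000.0, 1300.0],   # [last year, this year]
    "cost_of_sales":  [600.0,   850.0],
    "admin_expenses": [150.0,   160.0],
}

# Vertical analysis: each line item as a percentage of revenue, per year.
vertical = {
    item: [100 * values[y] / income_statement["revenue"][y] for y in (0, 1)]
    for item, values in income_statement.items()
}

# Horizontal analysis: this year's figure as a percentage of the base year.
horizontal = {
    item: 100 * values[1] / values[0]
    for item, values in income_statement.items()
}

for item in income_statement:
    print(f"{item:15s} vertical {vertical[item][0]:6.1f}% -> {vertical[item][1]:6.1f}%, "
          f"horizontal {horizontal[item]:6.1f}% of base year")
```

An unexplained jump in any of these percentages between years would be a candidate for further substantive work.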
Comparative ratio analysis also allows analysts and auditors to spot discrepancies within a firm's financial statements. By analyzing ratios, metrics such as days' sales in receivables and leverage multiples can be determined and examined for inconsistencies.
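As a small example of such a ratio, days' sales in receivables can be computed and compared across years (the figures below are hypothetical):

```python
# Days' sales in receivables (DSO) across two years, hypothetical figures.
def days_sales_in_receivables(receivables, annual_sales, days=365):
    """Average number of days a sale stays in receivables before collection."""
    return days * receivables / annual_sales

dso_prior = days_sales_in_receivables(receivables=120.0, annual_sales=1000.0)
dso_current = days_sales_in_receivables(receivables=260.0, annual_sales=1300.0)

# A sharp rise in DSO that outpaces sales growth is a classic red flag.
print(f"DSO moved from {dso_prior:.0f} to {dso_current:.0f} days")
```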
Modern/ Mathematical approach:
As technology improves, we need to improve our audit procedures as well in order to achieve our objective in the capacity of auditor. The classical approach is traditionally applied to highly aggregated data and can provide only a broad indication of potential fraud, whereas the modern, mathematical approach applied during the audit of a class of transactions gives more precise results for further verification and the application of substantive procedures.
These techniques include:
1. Digital analysis
2. Beneish model
1. Digital analysis:
Digital analysis, also called "Benford's law" analysis, examines the frequency of the digits of transaction amounts. The law has evolved over the years into an established set of expected probabilities, in percentage terms, for the occurrence of each digit in each position of an amount.
Since every transaction amount starts with one of the digits 1, 2, 3, 4, 5, 6, 7, 8 or 9, we can conduct 1st-, 2nd-, 3rd- and 4th-digit analyses of the given data. In each analysis we establish the observed frequency of each digit and compare it with the probability established by Benford's law. Where the calculated frequency materially exceeds the established probability, the corresponding transactions should be red-flagged as a potential fraud risk and verified further in detail. For example, on a data set of 7,000 transactions, a 1st-digit analysis produces the following results:
Table A

| 1st digit of transactions (A) | Number of transactions (B) | Percentage of total (C) |
|---|---|---|
| 1 | 1000 | 14.28571% |
| 2 | 500 | 7.142857% |
| 3 | 1100 | 15.71429% |
| 4 | 900 | 12.85714% |
| 5 | 600 | 8.571429% |
| 6 | 300 | 4.285714% |
| 7 | 100 | 1.428571% |
| 8 | 500 | 7.142857% |
| 9 | 2000 | 28.57143% |
| Total | 7000 | 100% |
Now compare the calculated results in column C with the established probabilities under Benford's law and identify the groups of transactions with potential fraud risk. Similarly, the frequencies of the 2nd, 3rd and 4th digits can be calculated and compared with the ideal probabilities to identify potential fraud within groups of transactions.
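The first-digit test can be sketched in Python. The expected percentages follow directly from Benford's formula, P(d) = log10(1 + 1/d); the transaction amounts below are hypothetical, and in practice the full ledger extract would be used:

```python
import math
from collections import Counter

def first_digit_frequencies(amounts):
    """Observed frequency (in %) of the leading digit 1-9 of each amount."""
    digits = [int(str(abs(a)).lstrip("0.")[0]) for a in amounts if a]
    counts = Counter(digits)
    total = len(digits)
    return {d: 100 * counts.get(d, 0) / total for d in range(1, 10)}

def benford_expected(d):
    """Benford's law: P(first digit = d) = log10(1 + 1/d), in %."""
    return 100 * math.log10(1 + 1 / d)

# Hypothetical transaction amounts for illustration only.
amounts = [912.5, 9310.0, 145.0, 1020.0, 987.0, 9402.0, 310.0, 450.0]
observed = first_digit_frequencies(amounts)
for d in range(1, 10):
    variation = observed[d] - benford_expected(d)
    flag = "  <-- investigate" if variation > 5 else ""
    print(f"digit {d}: actual {observed[d]:6.2f}%  "
          f"expected {benford_expected(d):6.2f}%  variation {variation:+7.2f}%{flag}")
```

The 5% investigation threshold used here is an illustrative assumption; in practice the auditor sets the materiality of the variation.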
The following table B gives the standard probability of each digit in the 1st, 2nd, 3rd and 4th places, for comparison with the actual calculated results.
Table B

| Digit | 1st place | 2nd place | 3rd place | 4th place |
|---|---|---|---|---|
| 0 | - | 11.968% | 10.178% | 10.018% |
| 1 | 30.103% | 11.389% | 10.138% | 10.014% |
| 2 | 17.609% | 10.882% | 10.097% | 10.010% |
| 3 | 12.494% | 10.433% | 10.057% | 10.006% |
| 4 | 9.691% | 10.031% | 10.018% | 10.002% |
| 5 | 7.918% | 9.668% | 9.979% | 9.998% |
| 6 | 6.695% | 9.337% | 9.940% | 9.994% |
| 7 | 5.799% | 9.035% | 9.902% | 9.990% |
| 8 | 5.115% | 8.757% | 9.864% | 9.986% |
| 9 | 4.576% | 8.500% | 9.827% | 9.983% |
Now compare the values calculated in column C of table A with the 1st-place probabilities given in table B:
| 1st digit of transactions | Actual frequency (A) | Standard frequency (B) | Variation (A - B) |
|---|---|---|---|
| 1 | 14.28571% | 30.103% | -15.82% |
| 2 | 7.142857% | 17.609% | -10.47% |
| 3 | 15.71429% | 12.494% | +3.22% |
| 4 | 12.85714% | 9.691% | +3.17% |
| 5 | 8.571429% | 7.918% | +0.65% |
| 6 | 4.285714% | 6.695% | -2.41% |
| 7 | 1.428571% | 5.799% | -4.37% |
| 8 | 7.142857% | 5.115% | +2.03% |
| 9 | 28.57143% | 4.576% | +24.00% |
Now look at the positive variations, where the actual frequency of the 1st digit exceeds the standard probable frequency. These variations indicate that the transactions starting with those digits (i.e. 3, 4, 5, 8 and 9) form a potentially risky area and need further verification and the application of substantive procedures.
An auditor intending to use digital analysis must keep in mind that it is applicable only to relatively large data sets, meaning at least 300 or so transactions; on small data sets the calculated results will not serve the purpose. Furthermore, once the technique is understood and applied at the first-digit level, the same approach can be used for 2nd-, 3rd- and 4th-digit analysis. Normally the desired results can be obtained from the 1st-digit analysis alone, and the 2nd-, 3rd- and 4th-digit analyses may not be needed.
2. Beneish Model:
The Beneish model was developed by Professor Messod D. Beneish and is widely used by auditors under the modern approach to identify potential fraud and manipulation at the financial statement level. The model comprises a set of financial ratios used to evaluate the financial statements of the company under audit, and the analysis produces a result called the M-score.
The threshold M-score has been set at -2.22. Auditors calculate the M-score of the company under audit and compare it with this standard. If the calculated score is less than -2.22 (i.e. more negative), the financial statements are unlikely to have been manipulated, whereas a greater (less negative) M-score suggests that they may have been. Note that the model is probabilistic: it signals likely manipulation but cannot by itself prove or rule it out.
In order to calculate the M-score, the following ratios are calculated from the financial statements under audit:
| Factor | Name | Formula | Basis |
|---|---|---|---|
| DSRI | Days' sales in receivables index | Receivables / total sales | This year / last year |
| GMI | Gross margin index | Gross profit / sales | Last year / this year |
| AQI | Asset quality index | (Non-current assets - property, plant and equipment) / total assets | This year / last year |
| SGI | Sales growth index | Total sales | This year / last year |
| DEPI | Depreciation index | Depreciation / (depreciation + PP&E) | Last year / this year |
| SGAI | Sales, general and admin expenses index | SG&A / revenue | This year / last year |
| TATA | Total accruals to total assets | (Change in working capital - change in cash - depreciation) / total assets | This year |
| LVGI | Leverage index | Total debt / total assets | This year / last year |
The factors/ratios calculated per the above table are then used in the following formula, with the given constant values, to calculate the M-score:
M = -4.84 + 0.920*DSRI + 0.528*GMI + 0.404*AQI + 0.892*SGI + 0.115*DEPI - 0.172*SGAI + 4.679*TATA - 0.327*LVGI
The above calculation is referred to as the 8-variable M-score because it uses 8 factors to analyze the financial statements for potential manipulation. Another version, the 5-variable M-score, can be used for the same purpose and is calculated with the following formula:
M = -6.065 + 0.823*DSRI + 0.906*GMI + 0.593*AQI + 0.717*SGI + 0.107*DEPI
Once all the ratios are calculated, compute the M-score, compare it with the standard of -2.22, and conclude whether the financial statements appear to have been manipulated.
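Putting the formulas together, the final scoring step can be sketched as follows; the index values passed in are hypothetical:

```python
def beneish_m_score(dsri, gmi, aqi, sgi, depi, sgai, tata, lvgi):
    """8-variable Beneish M-score; values above -2.22 warrant scrutiny."""
    return (-4.84 + 0.920 * dsri + 0.528 * gmi + 0.404 * aqi + 0.892 * sgi
            + 0.115 * depi - 0.172 * sgai + 4.679 * tata - 0.327 * lvgi)

def beneish_m_score_5(dsri, gmi, aqi, sgi, depi):
    """5-variable variant of the M-score."""
    return (-6.065 + 0.823 * dsri + 0.906 * gmi + 0.593 * aqi
            + 0.717 * sgi + 0.107 * depi)

# Hypothetical index values for illustration only.
m8 = beneish_m_score(dsri=1.67, gmi=0.97, aqi=1.16, sgi=1.30, depi=1.06,
                     sgai=0.91, tata=0.05, lvgi=1.17)
m5 = beneish_m_score_5(dsri=1.67, gmi=0.97, aqi=1.16, sgi=1.30, depi=1.06)

for label, m in [("8-variable", m8), ("5-variable", m5)]:
    verdict = "possible manipulation" if m > -2.22 else "not flagged"
    print(f"{label} M-score: {m:.2f} -> {verdict}")
```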
I believe that, once these techniques are properly understood, auditors will be comfortable using them to identify risky areas in a given set of financial statements. Readers' comments are always welcome and encouraged, and readers may contact me for further explanation or clarification regarding the application of the techniques described above.