Wednesday, December 11, 2019
Business Decision Making Analysis of Time Series
Question: Discuss business decision making for the analysis of time series.

Answer:

Introduction

Running a business within the competitive structure of the current market requires efficient planning and its effective implementation in the organisation's practices. This contributes to the organisation's ability to utilise its potential optimally and outrun its competitors. The procedure of business decision making develops the courses of action that broaden an organisation's chances of meeting these requirements. Business decision making is, however, a cross-sectional process and draws on several disciplines, such as accountancy, statistics and logical reasoning, in order to provide achievable and accurate decisions for business management. This assignment aims at elaborating this utility of business decision making. It highlights several data collection sources and their significance for both primary and secondary data, and it presents the range of techniques that contribute to effective business decision making. One significant feature of the assignment is its use of software-generated output to provide a more logical and accurate presentation.

A data collection plan for primary and secondary data

Collecting data in an efficient manner sets the foundation of effective research work. It therefore becomes the primary objective of a research professional to develop a data collection plan that facilitates the work that follows, and it is essential that the plan is reliable and stable (O'Leary, 2013, p.24). As the fundamental motive of this data collection plan is to support a company's product launch, it needs to capture the market dynamics for the product. Precisely, the plan needs to collect data representing both consumer behaviour regarding coffee and the perspective of the producers. While primary data are effective in representing consumer behaviour, secondary data are essential for understanding industry insights. In this regard, it is useful to distinguish the two concepts. Primary data are data that researchers collect directly from the selected sources and use for further calculation (Palinkas et al. 2015, p.538). While they are effective in representing the actual picture of the selected sample, they carry a greater risk of error, so the data collectors need to pay close attention to eliminating any bias or mistakes. At the other extreme, secondary data are drawn from one or more previously recorded sources. Although the chance of error is comparatively lower here, basing research entirely on secondary data raises questions about its credibility, especially in the case of business operations and product launches (Fassinger and Morrow, 2013, p.71). The data collection plan for the coffee sachet launch in London needs to concentrate on the citizens of London, and the selection of respondents needs to incorporate people of various ages, social classes and ethnicities. For the collection of secondary data, the researchers can rely on various private databases and government websites.
Survey methodology and sampling frame that need to be used (AC 1.2)
A sample questionnaire to conduct the research (AC 1.3)

Considering the competition in the current market structure, it becomes essential for every organisation to understand the market dynamics in order to experience a successful product launch. Understanding these dynamics requires the researchers to focus on the interaction between the two prime elements of the market, namely the customers and the brands performing in the market. Therefore, the questionnaire needs to incorporate the concerns of both the customers and the brand management.

The responses on the amount spent per order have been summarised in the following frequency distribution, from which the measures of central tendency are computed.

Table 1: Measures of central tendency (Source: Learner)

Amount Spent (£) | Mid-point (x) | No. of Orders (f) | fx | Cumulative frequency (cf)
0.5-10 | 5.25 | 7 | 36.75 | 7
10-20 | 15 | 9 | 135 | 16
20-30 | 25 | 12 | 300 | 28
30-40 | 35 | 14 | 490 | 42
40-50 | 45 | 16 | 720 | 58
50-60 | 55 | 17 | 935 | 75
60-70 | 65 | 16 | 1040 | 91
70-80 | 75 | 15 | 1125 | 106
80-90 | 85 | 8 | 680 | 114
90-100 | 95 | 6 | 570 | 120
Total | | 120 | 6031.75 |

Mean

Table 2: Calculation of mean (Source: Learner)

Amount Spent (£) | No. of Orders (f) | fx
0.5-10 | 7 | 36.75
10-20 | 9 | 135
20-30 | 12 | 300
30-40 | 14 | 490
40-50 | 16 | 720
50-60 | 17 | 935
60-70 | 16 | 1040
70-80 | 15 | 1125
80-90 | 8 | 680
90-100 | 6 | 570
Total | 120 | 6031.75

Formula for the mean = Sum(fx) / Sum(f)
Considering the above data, mean = 6031.75 / 120 = 50.26

Median

Table 3: Calculation of median (Source: Learner)

Amount Spent (£) | No. of Orders (f) | Cumulative frequency (cf)
0.5-10 | 7 | 7
10-20 | 9 | 16
20-30 | 12 | 28
30-40 | 14 | 42
40-50 | 16 | 58
50-60 | 17 | 75
60-70 | 16 | 91
70-80 | 15 | 106
80-90 | 8 | 114
90-100 | 6 | 120
Total | 120 |

Formula for the median = L + [(n/2 - CF)/f] * i, where L = lower boundary of the median class, f = frequency of the median class, CF = cumulative frequency of the class preceding the median class, and i = width of the median class.
Here n/2 = 120/2 = 60, and the cumulative frequency first reaches 60 in the 50-60 class, so the median class is 50-60.
Hence, median = 50 + [(60 - 58)/17] * 10 = 50 + 1.18 = 51.18

Mode

Table 4: Calculation of mode (Source: Learner)

Amount Spent (£) | No. of Orders (f) | Cumulative frequency (cf)
0.5-10 | 7 | 7
10-20 | 9 | 16
20-30 | 12 | 28
30-40 | 14 | 42
40-50 | 16 | 58
50-60 | 17 | 75
60-70 | 16 | 91
70-80 | 15 | 106
80-90 | 8 | 114
90-100 | 6 | 120
Total | 120 |

Formula for the mode = l + h * [(fm - f1)/(2fm - f1 - f2)], where l = lower limit of the modal class, h = class width, fm = frequency of the modal class, f1 = frequency of the preceding class, and f2 = frequency of the succeeding class.
From the above data, it can be observed that the highest frequency is 17, so the modal class is 50-60.
Therefore, mode = 50 + 10 * [(17 - 16)/(2 * 17 - 16 - 16)] = 50 + (10 * 0.5) = 55

Analysis of the results in order to provide advice (AC 2.2)

The mean is the average of the values in the given data, computed by summing the values and dividing by the number of observations (Park et al. 2016, p.230). The average amount spent is 50.26. The median is the middle value of the distribution, obtained by arranging the values in order and locating the point that splits the cumulative frequency in half (Chatfield, 2016, p.43). As per the calculation, the median is approximately 51.18. The mode is the value that occurs with the highest frequency and may occur more than once (Vakharia et al. 2016, p.1615). Here the mode is 55, so the most common level of spending falls in the 50-60 range, at around £55.
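The grouped-data calculations above can be reproduced programmatically. The following Python sketch is a minimal illustration, assuming the class boundaries and frequencies shown in Table 1; the variable names are illustrative and not part of the original assignment.

```python
# Grouped-data mean, median and mode for the "amount spent" frequency table.
# Class boundaries and frequencies are taken from Table 1.

classes = [(0.5, 10), (10, 20), (20, 30), (30, 40), (40, 50),
           (50, 60), (60, 70), (70, 80), (80, 90), (90, 100)]
freqs   = [7, 9, 12, 14, 16, 17, 16, 15, 8, 6]

n = sum(freqs)                                   # 120 orders in total
midpoints = [(lo + hi) / 2 for lo, hi in classes]

# Mean = sum(f * x) / sum(f)
mean = sum(f * x for f, x in zip(freqs, midpoints)) / n

# Median = L + ((n/2 - CF) / f) * i, using the class whose cumulative
# frequency first reaches n/2.
cum, median = 0, None
for (lo, hi), f in zip(classes, freqs):
    if cum + f >= n / 2:
        median = lo + ((n / 2 - cum) / f) * (hi - lo)
        break
    cum += f

# Mode = l + h * (fm - f1) / (2*fm - f1 - f2), using the modal class.
m = freqs.index(max(freqs))
lo, hi = classes[m]
fm, f1, f2 = freqs[m], freqs[m - 1], freqs[m + 1]
mode = lo + (hi - lo) * (fm - f1) / (2 * fm - f1 - f2)

print(f"mean = {mean:.2f}, median = {median:.2f}, mode = {mode:.2f}")
# Expected output: mean = 50.26, median = 51.18, mode = 55.00
```

Running the script confirms the figures reported above for the mean, median and mode of the grouped spending data.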
Calculating certain descriptive statistical measures (AC 2.3)

Range

Table 5: Calculation of range (Source: Learner)

Amount Spent (£)
0.5-10
10-20
20-30
30-40
40-50
50-60
60-70
70-80
80-90
90-100

The range is the difference between the upper and the lower limit of the data. Range for the given set of values = upper limit - lower limit = 100 - 0.5 = 99.5

Standard deviation

Table 6: Computation of standard deviation (Source: Learner)

Amount Spent (£) | Mid-point (x) | No. of Orders (f) | fx | fx²
0.5-10 | 5.25 | 7 | 36.75 | 192.9375
10-20 | 15 | 9 | 135 | 2025
20-30 | 25 | 12 | 300 | 7500
30-40 | 35 | 14 | 490 | 17150
40-50 | 45 | 16 | 720 | 32400
50-60 | 55 | 17 | 935 | 51425
60-70 | 65 | 16 | 1040 | 67600
70-80 | 75 | 15 | 1125 | 84375
80-90 | 85 | 8 | 680 | 57800
90-100 | 95 | 6 | 570 | 54150
Total | | 120 | 6031.75 | 374617.94

Standard deviation for the above data = [Sum(fx²)/Sum(f) - {Sum(fx)/Sum(f)}²]^(1/2)
= [(374617.94/120) - (6031.75/120)²]^(1/2)
= (3121.82 - 2526.53)^(1/2)
= 24.40

Lower quartile

Lower quartile Q1 = L1 + [(N/4 - F1)/f1] * c, where N = 120, L1 = lower boundary of the quartile class, F1 = cumulative frequency of the class preceding the quartile class, f1 = frequency of the quartile class, and c = class width.
N/4 = 120/4 = 30; the cumulative frequency first reaches 30 in the 30-40 class, so the lower quartile class is 30-40.
Therefore, Q1 = 30 + [(30 - 28)/14] * 10 = 30 + 1.43 = 31.43

Upper quartile

Upper quartile Q3 = L3 + [(3N/4 - F3)/f3] * c, where N = 120 and the remaining symbols are defined as above.
3N/4 = 3 * 120/4 = 90; the cumulative frequency first reaches 90 in the 60-70 class, so the upper quartile class is 60-70.
Therefore, Q3 = 60 + [(90 - 75)/16] * 10 = 60 + 9.38 = 69.38

Inter-quartile range

Quartiles are quantiles that divide the frequency distribution of the variable into groups containing equal fractions of the total number of observations. The inter-quartile range measures the spread of the middle half of the distribution:
Inter-quartile range = Q3 - Q1 = 69.38 - 31.43 = 37.95

Calculating the correlation coefficient and commenting on the result (AC 2.4)

The correlation coefficient is a statistical measure of the degree of linear relationship between two variables, which can be used to anticipate how changes in one are associated with changes in the other (Chatfield, 2016, p.65). Its usefulness when analysing business sales lies in the fact that it indicates the presence or absence, direction and strength of a relationship, although it does not by itself describe the causal character of that relationship.

Table 7: Computation of the correlation coefficient (Source: Learner)

Temperature (X) | Sales (Y) | XY | X² | Y²
20 | 320.00 | 6400 | 400 | 102400
24 | 411.00 | 9864 | 576 | 168921
11 | 192.00 | 2112 | 121 | 36864
17 | 259.00 | 4403 | 289 | 67081
9 | 170.00 | 1530 | 81 | 28900
15 | 243.00 | 3645 | 225 | 59049
25 | 430.00 | 10750 | 625 | 184900
Sum | 121 | 2025 | 38704 | 2317 | 648115

Correlation coefficient r = [n*Sum(XY) - Sum(X)*Sum(Y)] / sqrt{[n*Sum(X²) - (Sum(X))²] * [n*Sum(Y²) - (Sum(Y))²]}, with n = 7
= [7*38704 - 121*2025] / sqrt{[7*2317 - 121²] * [7*648115 - 2025²]}
= 25903 / sqrt(1578 * 436180)
= 25903 / 26235.3
= 0.99

The coefficient is close to +1, which indicates a very strong positive linear relationship between temperature and sales: sales rise as the temperature rises.
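The dispersion measures and the correlation coefficient can be checked with a short script. The sketch below assumes the same frequency table as before and the temperature and sales pairs from Table 7; it applies the grouped-data standard deviation and quartile formulas and the Pearson formula, and the function names are illustrative.

```python
import math

# Frequency table from Tables 1 and 6 (classes and number of orders).
classes = [(0.5, 10), (10, 20), (20, 30), (30, 40), (40, 50),
           (50, 60), (60, 70), (70, 80), (80, 90), (90, 100)]
freqs   = [7, 9, 12, 14, 16, 17, 16, 15, 8, 6]
n = sum(freqs)
mids = [(lo + hi) / 2 for lo, hi in classes]

# Grouped standard deviation: sqrt(sum(f*x^2)/n - (sum(f*x)/n)^2)
sum_fx  = sum(f * x for f, x in zip(freqs, mids))
sum_fx2 = sum(f * x * x for f, x in zip(freqs, mids))
std_dev = math.sqrt(sum_fx2 / n - (sum_fx / n) ** 2)

def grouped_quantile(p):
    """Quantile at cumulative position p*n using L + ((p*n - CF)/f) * width."""
    target, cum = p * n, 0
    for (lo, hi), f in zip(classes, freqs):
        if cum + f >= target:
            return lo + ((target - cum) / f) * (hi - lo)
        cum += f

q1, q3 = grouped_quantile(0.25), grouped_quantile(0.75)

# Pearson correlation for the temperature/sales pairs in Table 7.
X = [20, 24, 11, 17, 9, 15, 25]
Y = [320, 411, 192, 259, 170, 243, 430]
m = len(X)
num = m * sum(x * y for x, y in zip(X, Y)) - sum(X) * sum(Y)
den = math.sqrt((m * sum(x * x for x in X) - sum(X) ** 2) *
                (m * sum(y * y for y in Y) - sum(Y) ** 2))
r = num / den

print(f"std dev = {std_dev:.2f}, Q1 = {q1:.2f}, Q3 = {q3:.2f}, "
      f"IQR = {q3 - q1:.2f}, r = {r:.2f}")
# Expected: std dev = 24.40, Q1 = 31.43, Q3 = 69.38, IQR = 37.95, r = 0.99
```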
Developing line and bar diagrams and concluding on the behaviour of sales, costs, and profit

In order to derive bar and line diagrams, the following data set has been considered, which displays the sales, costs and profit of Graham Consultants Limited. Bar charts and line diagrams are considered among the most important techniques through which a proper and effective analysis can be performed (Pan et al. 2016, p.55), and they make it easier and less time consuming for researchers and analysts to assess the performance of the company.

Table 8: Table displaying sales, cost and profit (Source: Learner)

Year | Sales | All Costs (Direct and Indirect) | Final Profit
2000 | 165,000.00 | 135,000.00 | 45,000.00
2001 | 185,000.00 | 125,000.00 | 65,000.00
2002 | 225,000.00 | 145,000.00 | 85,000.00
2003 | 235,000.00 | 145,000.00 | 95,000.00
2004 | 315,000.00 | 175,000.00 | 145,000.00
2005 | 335,000.00 | 175,000.00 | 165,000.00
2006 | 265,000.00 | 165,000.00 | 105,000.00
2007 | 245,000.00 | 145,000.00 | 105,000.00
2008 | 245,000.00 | 175,000.00 | 75,000.00
2009 | 255,000.00 | 155,000.00 | 105,000.00
2010 | 295,000.00 | 155,000.00 | 145,000.00

The bar chart derived from the above data set for the 11 years from 2000 to 2010 is presented as follows:

Figure 1: Bar diagram for sales, cost and profit (Source: Learner)

The line diagram based on the above data set for Graham Consultants Limited is presented as follows:

Figure 2: Line diagram for sales, cost and profit (Source: Learner)

From the above bar chart and line diagram, it can be observed that sales are much higher than both the accumulated profit and the cost incurred. It can also be seen that, although sales are high, costs are too high to generate a substantial profit for the firm. Since profit is the difference between sales and cost, it is suggested that costs be lowered in order to generate higher profit and revenue.

Developing trend lines for forecasting the performance of sales, costs, and profit over the next three years (AC 3.2 and AC 4.1)

In order to derive the trend lines, the forecasts of sales, cost and profit for the next three years have been computed by extending each series by the average of its year-on-year changes.

Table 9: Table displaying sales, cost and profit for the extended three years (Source: Learner)

Year | Sales | All Costs | Final Profit
2000 | 165000 | 135000 | 45000
2001 | 185000 | 125000 | 65000
2002 | 225000 | 145000 | 85000
2003 | 235000 | 145000 | 95000
2004 | 315000 | 175000 | 145000
2005 | 335000 | 175000 | 165000
2006 | 265000 | 165000 | 105000
2007 | 245000 | 145000 | 105000
2008 | 245000 | 175000 | 75000
2009 | 255000 | 155000 | 105000
2010 | 295000 | 155000 | 145000
2011 | 306818 | 158636 | 145909
2012 | 318636 | 162272 | 146818
2013 | 330454 | 165908 | 147727

The trend lines derived from the above table are drawn out as follows:

Figure 3: Trend lines analysing sales, cost and profit (Source: Learner)

After computing the sales, profit and cost values for the next three years (2011, 2012 and 2013), the regression equation and the R-squared value have been obtained. In the words of Zhao et al. (2016, p.233), a low R-squared value indicates that a model is not a good fit; here the R-squared value is quite low, meaning that the independent variables, sales and cost, explain less than half of the variation in the dependent variable, profit.
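A simple way to reproduce trend lines of this kind is to fit a least-squares line to each series and project it forward. The sketch below assumes the 2000-2010 figures from Table 9 and uses an ordinary least-squares trend rather than the average-change extrapolation described above, so the forecasts it prints will differ somewhat from the table; the function names are illustrative.

```python
# Least-squares trend lines for sales, costs and profit (2000-2010),
# projected three years ahead. Figures are taken from Table 9.

years  = list(range(2000, 2011))
series = {
    "sales":  [165000, 185000, 225000, 235000, 315000, 335000,
               265000, 245000, 245000, 255000, 295000],
    "costs":  [135000, 125000, 145000, 145000, 175000, 175000,
               165000, 145000, 175000, 155000, 155000],
    "profit": [45000, 65000, 85000, 95000, 145000, 165000,
               105000, 105000, 75000, 105000, 145000],
}

def fit_line(xs, ys):
    """Return slope and intercept of the least-squares line y = a*x + b."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    a = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    return a, mean_y - a * mean_x

for name, values in series.items():
    slope, intercept = fit_line(years, values)
    forecast = {yr: round(slope * yr + intercept) for yr in (2011, 2012, 2013)}
    print(name, forecast)
```

Either extrapolation method produces a gently rising projection for all three series, which is consistent with the trend lines shown in Figure 3.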
A PowerPoint presentation on the findings with recommendations for the CEO (AC 3.3)
A formal business report to the regional managers (AC 3.4)

To the Regional Managers

Sales and costs are treated as the independent variables and profit as the dependent variable, since profit depends on both sales and cost. Because cost absorbs a large share of sales, profit moves inversely with cost. From the graphs it can be observed that sales are much higher than profit and that profit varies with cost; higher profit can be generated only if lower cost is incurred.

Forecasting may be defined as estimating sales, costs and profit in future years in order to depict a financial trend. According to Li et al. (2016, p.66), forecasting is mainly done in order to predict future values and to prepare a financial analysis of the overall performance of the firm. Its benefit is that the firm can anticipate profits, lower costs accordingly and increase sales to improve the performance of the company. Some recommendations that could be suggested for further improvement include the following:

- The company must focus on increasing its profit by diminishing costs and expenditure.
- The company should analyse and estimate its forecast errors in order to establish a positive and reliable trend.
- The company should improve its techniques for attracting customers so as to increase sales, so that the profit curve rises faster than the cost incurred.
- The relationship between the independent variables (sales and cost) and the dependent variable (profit) should be examined further, so that forecasts built on these variables become more dependable.

Calculating the project plan and identifying the critical path (AC 4.2)

The complete and clear accomplishment of any project requires drawing a Gantt chart and analysing the critical path through a network diagram, which yields better results and outcomes from the analysis. Following the suggestions of Kumar and Chakraborty (2016, p.17), a network diagram has been created in order to derive the critical path of the given activities.

Table 10: Activity list for the network diagram (Source: Learner)

Description | Activity | Preceding activities | Time (days)
Preparation | A | - | 6
Business planning | B | A | 4
Recruitment and selection | C | A | 38
Installation of peripherals | D | B | 17
Initial training | E | D | 6
Design | F | E | 11
Conversion | G | F | 11
Development of norms | H | C | 4
Assessments | I | B | 12
Continuous testing | J | D | 11
Policy documentation | K | G, H, I, J | 22
Appraisal | L | K | 22

Figure 4: Gantt Chart for QWM Investments Limited (Source: Learner)

Figure 5: Network Diagram for QWM Investments Limited (Source: Learner)

From the above network diagram, the longest path through the activities is A → B → D → E → F → G → K → L, so the critical path of the selected project can be computed as (6 + 4 + 17 + 6 + 11 + 11 + 22 + 22) days = 99 days.
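The critical path can be confirmed programmatically by a forward pass over the activity table. The following Python sketch assumes the activities, precedences and durations listed in Table 10; the identifiers are illustrative.

```python
# Forward pass over the activity network in Table 10: earliest finish times
# and the critical (longest) path. Durations are in days.

activities = {   # activity: (duration, list of preceding activities)
    "A": (6,  []),
    "B": (4,  ["A"]),
    "C": (38, ["A"]),
    "D": (17, ["B"]),
    "E": (6,  ["D"]),
    "F": (11, ["E"]),
    "G": (11, ["F"]),
    "H": (4,  ["C"]),
    "I": (12, ["B"]),
    "J": (11, ["D"]),
    "K": (22, ["G", "H", "I", "J"]),
    "L": (22, ["K"]),
}

earliest_finish, critical_pred = {}, {}

def finish(act):
    """Earliest finish time of an activity (memoised recursion)."""
    if act in earliest_finish:
        return earliest_finish[act]
    duration, preds = activities[act]
    start = 0
    for p in preds:
        if finish(p) > start:
            start, critical_pred[act] = finish(p), p
    earliest_finish[act] = start + duration
    return earliest_finish[act]

project_end = max(finish(a) for a in activities)          # project duration
last = max(activities, key=lambda a: earliest_finish[a])  # final activity
path = [last]
while path[-1] in critical_pred:                          # walk back along
    path.append(critical_pred[path[-1]])                  # the longest path
path.reverse()

print("critical path:", " -> ".join(path))
print("project duration:", project_end, "days")
# Expected: A -> B -> D -> E -> F -> G -> K -> L, 99 days
```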
Calculating the payback, NPV, and IRR for project Alistair and project Bromley

Each business organisation needs to go through a certain logical analysis in order to select a particular project from the options available to it. Certain techniques facilitate this analysis: decision makers typically prioritise the calculation of the payback period, the net present value (NPV) and the internal rate of return (IRR). Therefore, in order to understand the functioning of business decision making, it is essential to understand these techniques.

The payback period refers to the time required to recover the amount that an organisation invests in an asset from the cash inflows generated by that asset. Decision makers consider both the simple and the discounted payback; the predominant difference is that the discounted payback, unlike the simple payback, takes the time value of money into account. For even cash flows, the formula for the simple payback is:

Payback = Initial investment / Cash flow per period

However, as the projects in this case have uneven cash flows, the calculation is slightly more involved: the cumulative cash flow is tracked year by year and the payback point is interpolated within the year in which the cumulative flow turns positive. The payback calculations for the available projects are given in the following tables.

Table 11: Payback period of project Alistair (Source: Learner)

Year | Cash flow (£) | Present value factor (10%) | Discounted cash flow (£) | Cumulative cash flow (£) | Discounted cumulative cash flow (£)
Investment | -300,000 | 1 | -300,000.00 | -300,000 | -300,000.00
1 | 55,000 | 0.909091 | 50,000.00 | -245,000 | -250,000.00
2 | 100,000 | 0.826446 | 82,644.63 | -145,000 | -167,355.37
3 | 110,000 | 0.751315 | 82,644.63 | -35,000 | -84,710.74
4 | 95,000 | 0.683013 | 64,886.28 | 60,000 | -19,824.47
5 | 40,000 | 0.620921 | 24,836.85 | 100,000 | 5,012.39

Payback period = 3.37 years; discounted payback period = 4.80 years.

Table 12: Payback period of project Bromley (Source: Learner)

Year | Cash flow (£) | Present value factor (10%) | Discounted cash flow (£) | Cumulative cash flow (£) | Discounted cumulative cash flow (£)
Investment | -300,000 | 1 | -300,000.00 | -300,000 | -300,000.00
1 | 220,000 | 0.909091 | 200,000.00 | -80,000 | -100,000.00
2 | 50,000 | 0.826446 | 41,322.31 | -30,000 | -58,677.69
3 | 50,000 | 0.751315 | 37,565.74 | 20,000 | -21,111.95
4 | 20,000 | 0.683013 | 13,660.27 | 40,000 | -7,451.68
5 | 20,000 | 0.620921 | 12,418.43 | 60,000 | 4,966.75

Payback period = 2.60 years; discounted payback period = 4.60 years.

The NPV expresses the present value of a project's return by discounting its cash flows at the cost of capital associated with the project. Organisations therefore calculate the NPV of the available projects and select the project with the highest NPV, as it represents the greatest value in present terms. The formula for the NPV is:

NPV = -Investment + CF1/(1+k) + CF2/(1+k)^2 + ... + CFt/(1+k)^t

The concept of the internal rate of return is closely related to the NPV: the IRR is the discount rate at which the NPV of a project is zero. The NPV and IRR values for the given projects are presented in the tables below.

Table 13: NPV and IRR for project Alistair (Source: Learner)

Year | Cash flow (£)
Investment | -300,000
1 | 55,000
2 | 100,000
3 | 110,000
4 | 95,000
5 | 40,000
NPV (at 10%) = 5,012.39
IRR ≈ 11%

Table 14: NPV and IRR for project Bromley (Source: Learner)

Year | Cash flow (£)
Investment | -300,000
1 | 220,000
2 | 50,000
3 | 50,000
4 | 20,000
5 | 20,000
NPV (at 10%) = 4,966.75
IRR ≈ 11%

Writing a brief report to the board members with recommendations on choosing one project over the other

This report addresses the board members in support of their aim of selecting the project that would prove most profitable for the organisation. Every organisation pursues the motive of optimising its profit, so it is crucial to ensure effectiveness in choosing the most profitable project. With regard to choosing between project Alistair and project Bromley, management needs to focus on the profitability factors associated with each project. The board of directors should pay attention to the NPV and IRR values and the payback periods, as these tools measure the return from a project and thereby represent its profitability. The IRR values are approximately the same for the two projects, and project Bromley recovers its initial outlay more quickly. However, the NPV, which measures the value added after allowing for the time value of money, is higher for project Alistair. Therefore, the organisation should choose project Alistair over project Bromley.
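The appraisal figures quoted above can be reproduced with a short script. The sketch below assumes the cash flows from Tables 11 and 12 and a 10% discount rate; it computes the simple payback, the discounted payback, the NPV and an approximate IRR by bisection, and the function names are illustrative.

```python
# Investment appraisal for projects Alistair and Bromley (cash flows from
# Tables 11 and 12, 10% cost of capital).

RATE = 0.10
projects = {
    "Alistair": [-300_000, 55_000, 100_000, 110_000, 95_000, 40_000],
    "Bromley":  [-300_000, 220_000, 50_000, 50_000, 20_000, 20_000],
}

def npv(rate, flows):
    """Net present value with the first flow at time 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

def payback(flows, rate=0.0):
    """(Discounted) payback period, interpolated within the recovery year."""
    cumulative = 0.0
    for t, cf in enumerate(flows):
        pv = cf / (1 + rate) ** t
        if cumulative + pv >= 0:
            return t - 1 + (-cumulative) / pv
        cumulative += pv
    return None  # investment never recovered

def irr(flows, lo=0.0, hi=1.0, tol=1e-6):
    """IRR by bisection (assumes NPV changes sign once between lo and hi)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

for name, flows in projects.items():
    print(f"{name}: payback = {payback(flows):.2f} years, "
          f"discounted payback = {payback(flows, RATE):.2f} years, "
          f"NPV = {npv(RATE, flows):,.2f}, IRR = {irr(flows):.1%}")
# Both IRRs come out at roughly 11%, matching the rounded values above.
```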
Conclusion

The assignment on business decision making consists of five tasks, each of which is based on statistical analysis and computation. In the first task, a questionnaire is framed and designed so that the survey results can be understood in a better and more effective way; the next task computes different measures of central tendency and dispersion. Following task 2, trend lines for sales, cost and profit are derived to analyse the performance of the company. In the next task, the critical path and network diagram of the project are calculated, followed by the calculation of capital budgeting tools to make an appropriate choice between the projects.

References

Chatfield, C. (2016). The analysis of time series: an introduction. Florida: CRC Press.

Fassinger, R. and Morrow, S. (2013). Toward best practices in quantitative, qualitative, and mixed-method research: A social justice perspective. Journal for Social Action in Counseling and Psychology, 5(2), pp.69-83.

Kumar, A. and Chakraborty, B.S. (2016). Application of critical path analysis in clinical trials. Journal of Advanced Pharmaceutical Technology & Research, 7(1), p.17.

Li, W.Y., Chow, P.S., Choi, T.M. and Chan, H.L. (2016). Supplier integration, green sustainability programs, and financial performance of fashion enterprises under global financial crisis. Journal of Cleaner Production, 135, pp.57-70.

O'Leary, Z. (2013). The essential guide to doing your research project. Sage.

Palinkas, L.A., Horwitz, S.M., Green, C.A., Wisdom, J.P., Duan, N. and Hoagwood, K. (2015). Purposeful sampling for qualitative data collection and analysis in mixed method implementation research. Administration and Policy in Mental Health and Mental Health Services Research, 42(5), pp.533-544.

Pan, J.N., Li, C.I. and Shih, W.C. (2016). New multivariate process capability indices for measuring the performance of multivariate processes subject to non-normal distributions. International Journal of Quality & Reliability Management, 33(1), pp.42-61.

Park, H.S., Lee, M., Kang, H., Hong, T. and Jeong, J. (2016). Development of a new energy benchmark for improving the operational rating system of office buildings using various data-mining techniques. Applied Energy, 173, pp.225-237.

Vakharia, V., Gupta, V.K. and Kankar, P.K. (2016). A comparison of feature ranking techniques for fault diagnosis of ball bearing. Soft Computing, 20(4), pp.1601-1619.

Zhao, X., Xu, Y., Zhao, R. and Wu, J. (2016). A mixed model for open-end fund performance evaluation. In: 2016 Eighth International Conference on Advanced Computational Intelligence (ICACI), pp.230-235.