Application of the ARIMA Models for Predicting Students’ Admissions in the University of Lagos


J. N. Onyeka-Ubaka et al.


The objective of this study was to assess the performance of AutoRegressive Integrated Moving Average (ARIMA) models when occasional level shifts occur in the time series under study. Secondary data on undergraduate admissions at the University of Lagos (1962–2016) were collected and analysed. It is envisaged that universities in Nigeria and elsewhere could forecast their enrolment figures and student population growth rates using ARIMA models. The Box–Jenkins (B–J) approach provided the theoretical framework for the statistical analysis. The study used the Kalman Filter (KF) algorithm to develop an ARIMA-based method that resolves the three main problems of the B–J methodology. The KF estimated the states of dynamic systems in state-variable formulation.
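The state-estimation step can be illustrated with a minimal scalar Kalman filter for an AR(1)-plus-noise state-space model. This is a sketch under assumed parameters: the function name and the values of `phi`, `q` and `r` are hypothetical choices for illustration, not taken from the paper.

```python
# Minimal scalar Kalman filter for the state-space model
#   state:       x_t = phi * x_{t-1} + w_t,  w_t ~ N(0, q)
#   observation: y_t = x_t + v_t,            v_t ~ N(0, r)
# All parameter values below are illustrative assumptions.

def kalman_filter_ar1(ys, phi=0.8, q=1.0, r=0.5):
    """Run the filter over observations ys; return filtered state estimates."""
    x = 0.0                      # initial state mean
    p = q / (1.0 - phi ** 2)     # stationary prior variance of the AR(1) state
    estimates = []
    for y in ys:
        # Predict step: propagate mean and variance through the AR(1) dynamics
        x_pred = phi * x
        p_pred = phi * phi * p + q
        # Update step: blend prediction and observation via the Kalman gain
        k = p_pred / (p_pred + r)
        x = x_pred + k * (y - x_pred)
        p = (1.0 - k) * p_pred
        estimates.append(x)
    return estimates
```

For example, `kalman_filter_ar1([1.0] * 20)` settles near (but below) 1.0, because each prediction shrinks the state by `phi` before the observation pulls it back up.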

Forecasting university admissions is necessary if the student population is to match the infrastructural provisions on campuses. The best ARIMA models were selected using criteria such as Akaike's Information Criterion (AIC), Schwarz's Bayesian Criterion (SBC), Absolute Mean Error (AME), Root Mean Square Error (RMSE) and Mean Absolute Percent Error (MAPE). To select the best ARIMA model, the data were split into two periods: an estimation period and a validation period. The results clearly showed a continual increase in the demand for university education at the University of Lagos and, by extension, at other universities in Nigeria.
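The estimation/validation split can be sketched as follows. This is a simplified illustration, not the paper's procedure: a zero-intercept AR(1) stands in for a full ARIMA model, and the function names and synthetic series are assumptions.

```python
import math

def fit_ar1(series):
    """Least-squares estimate of phi in y_t = phi * y_{t-1} + e_t (no intercept)."""
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(series[t - 1] ** 2 for t in range(1, len(series)))
    return num / den

def evaluate(series, split):
    """Fit on the estimation period; score one-step forecasts on the validation period."""
    est, val = series[:split], series[split:]
    phi = fit_ar1(est)
    # In-sample AIC on the estimation period (one fitted parameter);
    # the max() guard avoids log(0) on a perfectly fitted series.
    sse = sum((est[t] - phi * est[t - 1]) ** 2 for t in range(1, len(est)))
    n = len(est) - 1
    aic = n * math.log(max(sse, 1e-12) / n) + 2
    # Out-of-sample one-step-ahead forecast errors over the validation period
    prev, errs, pct = est[-1], [], []
    for y in val:
        f = phi * prev
        errs.append(y - f)
        pct.append(abs((y - f) / y))
        prev = y
    rmse = math.sqrt(sum(e * e for e in errs) / len(errs))
    mape = 100.0 * sum(pct) / len(pct)
    return phi, aic, rmse, mape
```

In practice one would compute AIC/SBC for several candidate (p, d, q) orders and retain the model with the lowest criterion values and the smallest validation-period RMSE and MAPE.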


How to Cite
Onyeka-Ubaka, J. N. et al. (2018). Application of the ARIMA Models for Predicting Students’ Admissions in the University of Lagos. Journal of Scientific Research and Development, 17(1), 80–90.


References
Anderson, B. D. O. and Moore, J. B. (1979). Optimal Filtering. Englewood Cliffs, New Jersey: Prentice-Hall, p. 353–354.
Akaike, H. (1974). A new look at the statistical model identification. IEEE Transactions on Automatic Control AC-19, p. 716–723.
Bollerslev, T. (1986). Generalized autoregressive conditional heteroskedasticity, J. Econometrics, 31: 307–327.
Box, G. E. P. and Cox, D. R. (1964). An analysis of transformations. Journal of the Royal Statistical Society, Series B 26: 211–252.
Box, G. E. P. and Tiao, G. C. (1968). A Bayesian approach to some outlier problems. Biometrika, 55: 119–129.
Box, G. E. P. and Jenkins, G. M. (1976). Time Series Analysis, Forecasting and Control, Holden-Day, San Francisco, p. 185–236.
Box, G. E. P., Jenkins, G. M. and Reinsel, G. C. (1994). Time Series Analysis, Forecasting and Control, 3rd edn., Englewood Cliffs, New Jersey: Prentice-Hall, p. 85–155.
De Jong, P. and Penzer, J. (2000). The ARIMA model in State-space form, Department of Statistics, London School of Economics Houghton Street, London, WC2A 2AE, UK, p. 1–10.
Dickey, D. A. and Pantula, S. G. (1987). Determining the order of differencing in autoregressive processes, Journal of Business and Economic Statistics, 5: 455–461.
Engle, R. F. (1982). Autoregressive conditional heteroskedasticity with estimates of the variance of United Kingdom inflation. Econometrica, 50: 987–1007.
Fuller, W. A. (1976). Introduction to Statistical Time Series, NY: John Wiley & Sons, Inc., p. 3–6.
Hamilton, J. D. (1994). Time Series Analysis. Princeton: Princeton University Press, p. 373–399.
Harvey, A. C. (1989). Forecasting, Structural Time Series Models and the Kalman Filter, Cambridge University Press, Cambridge, p. 301.
Hoff, J. C. (1983). A Practical Guide to Box–Jenkins Forecasting, Belmont, CA: Lifetime Learning Publications, p. 316.
Contreras, J., Espínola, R., Nogales, F. J. and Conejo, A. J. (2003). ARIMA models to predict next-day electricity prices, IEEE Transactions on Power Systems, 18(3): 1014–1020.
Khashei, M., Bijari, M. and Ardali, G. A. R. (2012). Hybridization of autoregressive integrated moving average (ARIMA) with probabilistic neural networks, Computers and Industrial Engineering, 63(1): 37–45.
Klein, L. R. (1986). An Essay on the Theory of Economic Prediction, Markham, Chicago, p. 1–30.
Koopman, S. J. (1993). Disturbance smoother for state space models. Biometrika, 80: 117–126.
Kwiatkowski, D., Phillips, P. C. B., Schmidt, P. and Shin, Y. (1992). Testing the null hypothesis of stationarity against the alternative of a unit root, J. Econometrics, 54: 159–178.
Lee, C. and Ho, C. (2011). Short-term load forecasting using lifting scheme and ARIMA model, Expert System with Applications, 38(5): 5902–5911.
Moral, P. and González, P. (2003). Univariate Time Series Modelling, Dec. 10, p. 53–147.
Onyeka-Ubaka J. N. and Abass, O. (2013). Central Bank of Nigeria (CBN) intervention and the future of stocks in the banking sector. American Journal of Mathematics and Statistics, 3(6): 407–416.
Onyeka-Ubaka J. N., Abass, O. and Okafor, R. O. (2014). Conditional variance parameters in symmetric models. International Journal of Probability and Statistics, 3(1): 1–7.
Qu, N., Dark, J. and Zhang, X. (2006). Influence Diagnostics in a Bivariate GARCH process. Monash University, Australia, p. 278–291.
Pankratz, A. (1991). Forecasting with Dynamic Regression Models, New York: John Wiley & Sons, Inc, p. 167–201.
Pearlman, J. G. (1980). An algorithm for the exact likelihood of a high-order autoregressive-moving average process, Biometrika, 67(1): 232–233.
Pötscher, B. M. and Srinivasan, S. (1994). A comparison of order determination procedures for ARIMA models, Statistica Sinica, 4: 29–50.
Schwarz, G. (1978). Estimating the dimension of a model, Annals of Statistics, 6: 461–464.
Shim, J. K., Siegel, J. G. and Liew, C. J. (1994). Strategic Business Forecasting, Probus Publishing Company, Chicago, p. 152–243.
Whittle, P. (1984). Prediction and Regulation by Linear Least-Square Methods, 2nd edn., Oxford: Blackwell, 187 pp.
Yao, Y. (1984). Estimation of a noisy discrete-time step function: Bayes and empirical Bayes approaches. The Annals of Statistics, 12(4): 1434–1447.