Show simple item record

dc.contributor.author Wang, Xin
dc.identifier.citation Wang, X. (2005, November 15). Research of mixture of experts model for time series prediction (Thesis, Doctor of Philosophy). University of Otago. Retrieved from
dc.description xxiv, 237 leaves : ill. ; 30 cm. Includes bibliographical references. University of Otago department: Information Science. "15 November 2005".
dc.description.abstract For the prediction of chaotic time series, a dichotomy has arisen between local approaches and global approaches. Local approaches have a reputation for simplicity and feasibility, but they generally do not produce a compact description of the underlying system and are computationally intensive. Global approaches require less computation and can yield a global representation of the studied time series; however, owing to the complexity of the time series process, it is often difficult to construct a global model that performs the prediction precisely. In addition to these approaches, a combination of the global and local techniques, called mixture of experts (ME), is also possible, in which a smaller number of models work cooperatively to implement the prediction. This thesis reports on research into ME models for chaotic time series prediction. Based on a review of techniques in time series prediction, an HMM-based ME model called "Timeline" Hidden Markov Experts (THME) is developed, in which the trajectory of the time series is divided into regimes in the state space and regression models called local experts learn the mapping on each regime separately. The dynamics for the expert combination is an HMM; however, the transition probabilities are designed to be time-varying and conditional on the "real time" information of the time series. For the learning of the "time-line" HMM, a modified Baum-Welch algorithm is developed and the convergence of the algorithm is proved. Different versions of the model, based on MLP, RBF and SVM experts, are constructed and applied to a number of chaotic time series for both one-step-ahead and multi-step-ahead prediction. Experiments show that, in general, THME achieves better generalization performance than the corresponding single models in one-step-ahead prediction, and performance comparable to some published benchmarks in multi-step-ahead prediction.
Various properties of THME are investigated, including the feature selection for trajectory dividing, the clustering techniques for regime extraction, the "time-line" HMM for expert combination, and the performance of the model with different numbers of experts. A number of interesting future directions for this work are suggested, including feature selection for regime extraction, model selection for transition probability modelling, the extension to distribution prediction, and application to other time series.
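The expert-combination idea the abstract describes can be sketched in a few lines: each regime has its own local expert, a hidden Markov model carries a belief over regimes, and the mixture's prediction is the belief-weighted sum of the experts' outputs. The sketch below is illustrative only (all names, numbers, and the fixed transition matrix are assumptions for the example), not the thesis's THME implementation, whose transition probabilities are time-varying and learned with a modified Baum-Welch algorithm.

```python
# Minimal mixture-of-experts combination with HMM-style gating (illustrative).

def forward_step(belief, transition, likelihood):
    """One HMM forward step: propagate the regime belief through the
    transition matrix, then reweight by how well each regime's expert
    explained the most recent observation, and normalise."""
    n = len(belief)
    predicted = [sum(belief[i] * transition[i][j] for i in range(n))
                 for j in range(n)]
    weighted = [p * l for p, l in zip(predicted, likelihood)]
    z = sum(weighted)
    return [w / z for w in weighted]

def mixture_prediction(belief, expert_outputs):
    """Combine per-regime expert predictions, weighted by the belief."""
    return sum(b * y for b, y in zip(belief, expert_outputs))

# Two regimes, two experts (all values illustrative).
transition = [[0.9, 0.1], [0.2, 0.8]]   # assumed fixed transition matrix
belief = [0.5, 0.5]                      # initial regime probabilities
likelihood = [0.8, 0.1]                  # expert 1 explained the last point better
belief = forward_step(belief, transition, likelihood)
y_hat = mixture_prediction(belief, [1.2, -0.4])  # experts' one-step-ahead outputs
```

In THME the gating is richer: the transition probabilities themselves depend on the "real time" information of the series, rather than being a fixed matrix as in this sketch.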
dc.publisher University of Otago
dc.subject chaotic time series
dc.subject "Timeline" Hidden Markov Experts
dc.subject multi-step-ahead prediction
dc.subject distribution prediction
dc.subject time series
dc.subject Mixture of Experts
dc.subject Markov processes
dc.subject.lcsh T Technology (General)
dc.subject.lcsh Q Science (General)
dc.subject.lcsh HG Finance
dc.subject.lcsh HF5601 Accounting
dc.title Research of mixture of experts model for time series prediction
otago.school Information Science
otago.openaccess Abstract Only
dc.identifier.eprints 389

Files in this item


There are no files associated with this item.

This item is not available in full text via OUR Archive.

If you would like to read this item, please apply for an inter-library loan from the University of Otago via your local library.

If you are the author of this item, please contact us if you wish to discuss making the full text publicly available.
