Black-box Optimization for Information Retrieval through Dynamic Parameter

Authors

  • V. Vinay Kumar  Assistant Professor, Department of CSE, Matrusri Engineering College, Saidabad, Telangana, India
  • K. Niharika chowdhary  Student, Department of CSE, Matrusri Engineering College, Saidabad, Telangana, India
  • P. Pavani Reddy  Student, Department of CSE, Matrusri Engineering College, Saidabad, Telangana, India
  • B. Varsha  Student, Department of CSE, Matrusri Engineering College, Saidabad, Telangana, India

Keywords:

Information Retrieval, Optimization, Parameter Estimation

Abstract

The retrieval function is one of the most critical components of an Information Retrieval (IR) system, since it determines to what degree some information is relevant to a user query. Most retrieval functions have free parameters whose values must be set before retrieval, significantly influencing the effectiveness of the IR system. Choosing the optimal values for such parameters is therefore of paramount importance. However, the optimum must be found through a computationally expensive process, particularly when the generalization error is estimated via cross-validation. In this paper, we propose to determine free parameter values by solving an optimization problem aimed at maximizing a measure of retrieval effectiveness. We employ the black-box optimization paradigm, since the analytical expression of the measure of effectiveness with respect to the free parameters is unknown. We consider different strategies for solving the black-box optimization problem: a basic grid search over the whole domain, and more sophisticated techniques such as line search and surrogate model based algorithms. Experimental results on several test collections provide useful insight not only about effectiveness but also about efficiency: they show that with appropriate optimization strategies, the computational cost of parameter tuning can be significantly reduced without compromising retrieval effectiveness, even when considering cross-validation.
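
The sketch below (not part of the paper) illustrates the black-box setting described above in Python: the effectiveness measure is available only through an evaluation call, and two of the strategies mentioned, grid search over the whole domain and a simple coordinate-wise line search, are applied to it. The names evaluate_map, grid_search and coordinate_line_search, and the choice of BM25's k1 and b as the free parameters, are assumptions made for illustration; evaluate_map stands in for running the retrieval system on a test collection and computing Mean Average Precision, and is replaced here by a smooth toy function so the sketch runs on its own.

import itertools
import math


def evaluate_map(k1: float, b: float) -> float:
    """Hypothetical black box: run retrieval with (k1, b) and return MAP.

    Replaced by a smooth toy function so this sketch is self-contained;
    in a real system this would query a test collection and score the runs.
    """
    return math.exp(-((k1 - 1.2) ** 2 + 4.0 * (b - 0.75) ** 2))


def grid_search(k1_values, b_values):
    """Exhaustive grid search over the whole parameter domain."""
    best_score, best_params = -1.0, None
    for k1, b in itertools.product(k1_values, b_values):
        score = evaluate_map(k1, b)
        if score > best_score:
            best_score, best_params = score, (k1, b)
    return best_score, best_params


def coordinate_line_search(start, steps=20, rounds=3,
                           bounds=((0.1, 3.0), (0.0, 1.0))):
    """Cheaper strategy: optimize one parameter at a time along a line,
    keeping the others fixed, and repeat for a few rounds."""
    params = list(start)
    for _ in range(rounds):
        for i, (lo, hi) in enumerate(bounds):
            candidates = [lo + (hi - lo) * j / (steps - 1) for j in range(steps)]
            params[i] = max(
                candidates,
                key=lambda v, i=i: evaluate_map(*(params[:i] + [v] + params[i + 1:])),
            )
    return evaluate_map(*params), tuple(params)


if __name__ == "__main__":
    k1_grid = [0.1 * i for i in range(1, 31)]    # k1 in [0.1, 3.0]
    b_grid = [0.05 * i for i in range(0, 21)]    # b  in [0.0, 1.0]
    print("grid search:", grid_search(k1_grid, b_grid))        # 630 evaluations
    print("line search:", coordinate_line_search([1.0, 0.5]))  # 120 evaluations

A surrogate model based strategy would instead fit a model (for example a Gaussian-process response surface) to the evaluations made so far and pick the next (k1, b) to try by maximizing an acquisition criterion, which is where most of the efficiency gain discussed in the abstract comes from.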

Published

2018-05-30

Issue

Volume 3, Issue 5 (May-June 2018)

Section

Research Articles

How to Cite

[1]
V. Vinay Kumar, K. Niharika chowdhary, P. Pavani Reddy, B. Varsha, "Black-box Optimization for Information Retrieval through Dynamic Parameter", International Journal of Scientific Research in Computer Science, Engineering and Information Technology (IJSRCSEIT), ISSN: 2456-3307, Volume 3, Issue 5, pp. 82-87, May-June 2018.