Introduction to open source:
When I first began working on NeuroLeveL, the most I thought it would do was help maybe 20 bots (at most) adopt dynamic XP leveling, or just add another layer of dynamism. But now, nearly a month after its release, it has been cloned by some 250+ people on GitHub (I can't pinpoint the exact number, as GitHub doesn't show lifetime traffic, at least to my knowledge). So I wouldn't be incorrect to say that NeuroLeveL might've been a needed building block which people seem to be enjoying working with, and I really look forward to what comes of it henceforth!
That said, in that same month it has rarely gotten any new changes, additions, or even work in general. I honestly wasn't aware of how much maintaining and updating goes into such projects. So, moving on to why it hasn't been getting updates:
Having recently entered this "developer's" landscape, I have been diligently noticing that amidst all the coding and programming which goes into a project, there is often a lack of academia (a research basis) behind projects (at least in my experience, which I'll admit isn't much); that is its own topic I would like to dwell on some other time. So, on a curious limb, I wanted to walk into this so-called 'Ivory Tower', not only to see what these academics do but also to understand what leads to this lapse in communication. And so, approximately two years ago, I decided to dive headfirst into the world of academia.
(March-June/July 2024):
I was always curious about what AI actually stood for and how, in reality, it worked. Hence, I decided to begin reading the foundational papers behind AI. Though it seems magical to most to just type something and get a response, the foundations behind these systems have remained consistent for nearly 75 years. So, I began with the famous imitation game[1], moved on to the coining of 'AI' followed by the first perceptron[2, 3], leading to gradient studies which spiraled into gradient descent and back-propagation[4, 5, 6, 7]. Then, moving away from the core of AI, I broadened my interests into RNNs[8] and CNNs[9], going in depth into eigenvalues and object detection while covering topics like LSTMs and memory storage in RNNs. This then diverted to fuzzy logic[10] and fuzzy membership functions[11], covering the base of fuzzy theory as it builds upon set theory. Finally, this madness ended with statistical techniques like PCA[12], k-means clustering, SVMs[13, 14, 15], etc. (This isn't to say that I know a lot, but to state that I had covered a lot of material.)
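To give a flavor of how simple those early foundations really are, here's a tiny sketch of Rosenblatt's perceptron learning rule (my own illustrative example, not code from any of the cited papers): it learns the logical AND function with the classic update `w += lr * (y - y_hat) * x`.

```python
# A minimal perceptron (illustrative sketch): learn the AND function.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 0, 1]  # AND truth table

w = [0.0, 0.0]
b = 0.0
lr = 0.1

def predict(x):
    # step activation over a weighted sum, as in the 1958 formulation
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

for _ in range(20):  # AND is linearly separable, so this converges quickly
    for xi, yi in zip(X, y):
        err = yi - predict(xi)  # classic update: w += lr * (y - y_hat) * x
        w[0] += lr * err * xi[0]
        w[1] += lr * err * xi[1]
        b += lr * err

print([predict(xi) for xi in X])  # reproduces the AND column: [0, 0, 0, 1]
```

The same rule fails on XOR, which is exactly the limitation that pushed the field toward multi-layer networks and back-propagation.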
(August-December 2024):
Like every other student, in an attempt to actually remember what I'd studied, I wrote everything down and documented it in depth. However, given the volume of topics I had read about, it seemed a shame to keep this collection of knowledge to myself. So, I did the thing I had been doing for the last year or so...
I combined all of the text to form a comprehensive summary or rather an Intro to Artificial Intelligence. (Essentially forming my own literature review of sorts!)
(March-July 2025):
So, in the time since my last post, I have been busy compiling all these papers in a more structured manner. This led to three papers, each tackling a key axis of AI: the basics of AI, statistical analysis using the techniques already mentioned, and finally an introduction to optimizers! And I am pretty proud to share that one of these papers is available on TechRxiv. If you want to understand optimizers, or just read up on how they work, head over to TechRxiv and search for "Training Neural Networks: Introduction to Optimizers", or follow this link: doi.org/10.36227/techrxiv.175372463.38769899/v1
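As a taste of the kind of update rule the optimizer literature deals with (my own toy sketch, not code taken from the paper): vanilla gradient descent versus the heavy-ball momentum variant, minimizing the one-dimensional function f(w) = w², whose gradient is 2w.

```python
# Toy sketch: gradient descent with an optional momentum term on f(w) = w**2.
def optimize(momentum=0.0, lr=0.1, steps=100, w0=5.0):
    w, v = w0, 0.0
    for _ in range(steps):
        grad = 2 * w                  # df/dw for f(w) = w**2
        v = momentum * v + lr * grad  # velocity accumulates past gradients
        w -= v                        # parameter step
    return w

plain = optimize(momentum=0.0)   # vanilla gradient descent
heavy = optimize(momentum=0.9)   # momentum smooths and can accelerate descent
print(plain, heavy)              # both end up near the minimum at w = 0
```

Setting `momentum=0.0` recovers plain gradient descent; most modern optimizers are elaborations of exactly this loop.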
And as for the NeuroLeveL framework, it is far from complete, and I have my own plans for it, so expect some changes in the upcoming months...
Until then... may those semi-colons favor you!
Sources:
- Turing, Alan Mathison. "Computing machinery and intelligence." Mind 59.236 (1950): 433-460.
- McCarthy, John, et al. "A proposal for the Dartmouth summer research project on artificial intelligence, august 31, 1955." AI magazine 27.4 (2006): 12-12.
- Rosenblatt, Frank. "The perceptron: a probabilistic model for information storage and organization in the brain." Psychological review 65.6 (1958): 386.
- Burgess, John C. "Adaptive Signal Processing edited by Bernard Widrow and Samuel D. Stearns." (1986): 991-992.
- Cioffi, John, and Thomas Kailath. "Fast, recursive-least squares transversal filters for adaptive filtering." IEEE Transactions on Acoustics, Speech, and Signal Processing 32.2 (1984): 304-337.
- Mandic, Danilo P. "A generalized normalized gradient descent algorithm." IEEE signal processing letters 11.2 (2004): 115-118.
- Mathews, V. John, and Zhenhua Xie. "A stochastic gradient adaptive filter with gradient adaptive step size." IEEE transactions on Signal Processing 41.6 (1993): 2075-2087.
- Hochreiter, Sepp, and Jürgen Schmidhuber. "LSTM can solve hard long time lag problems." Advances in neural information processing systems 9 (1996).
- Zhiqiang, Wang, and Liu Jun. "A review of object detection based on convolutional neural network." 2017 36th Chinese control conference (CCC). IEEE, 2017.
- Dernoncourt, Franck. "Introduction to fuzzy logic." Massachusetts Institute of Technology 21 (2013): 50-56.
- Thaker, Shaily, and Viral Nagori. "Analysis of fuzzification process in fuzzy expert system." Procedia computer science 132 (2018): 1308-1316.
- Pearson, Karl. "On lines and planes of closest fit to systems of points in space." The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science 2.11 (1901): 559-572.
- Boswell, Dustin. "Introduction to support vector machines." Department of Computer Science and Engineering, University of California San Diego 11 (2002): 16-17.
- Guo, Shanshan, Shiyu Chen, and Yanjie Li. "Face recognition based on convolutional neural network and support vector machine." 2016 IEEE International Conference on Information and Automation (ICIA). IEEE, 2016.
(For those interested, the total list consists of 70 papers; here they all are in order. I won't be specifying which ones are for what, as it has already taken eons to write and compile all the documents for this blog!):
- Turing, Alan Mathison. "Computing machinery and intelligence." Mind 59.236 (1950): 433-460.
- McCarthy, John, et al. "A proposal for the Dartmouth summer research project on artificial intelligence, august 31, 1955." AI magazine 27.4 (2006): 12-12.
- Buchanan, Bruce G. "A (very) brief history of artificial intelligence." Ai Magazine 26.4 (2005): 53-53.
- Rosenblatt, Frank. "The perceptron: a probabilistic model for information storage and organization in the brain." Psychological review 65.6 (1958): 386.
- Hsu, Feng-hsiung. "IBM's deep blue chess grandmaster chips." IEEE micro 19.2 (1999): 70-81.
- Wu, Yu-Chen, and Jun-wen Feng. "Development and application of artificial neural network." Wireless Personal Communications 102 (2018): 1645-1656.
- Adam, Martin, Michael Wessel, and Alexander Benlian. "AI based chatbots in customer service and their effects on user compliance." Electronic Markets 31.2 (2021): 427-445.
- Mokhtari, Sohrab, Kang K. Yen, and Jin Liu. "Effectiveness of artificial intelligence in stock market prediction based on machine learning." arXiv preprint arXiv:2107.01031 (2021).
- Dewitte, Steven, et al. "Artificial intelligence revolutionizes weather forecast, climate monitoring and decadal prediction." Remote Sensing 13.16 (2021): 3209.
- Mahesh, Batta. "Machine learning algorithms - a review." International Journal of Science and Research (IJSR) 9.1 (2020): 381-386.
- De Ville, Barry. "Decision trees." Wiley Interdisciplinary Reviews: Computational Statistics 5.6 (2013): 448-455.
- Boser, Bernhard E., Isabelle M. Guyon, and Vladimir N. Vapnik. "A training algorithm for optimal margin classifiers." Proceedings of the fifth annual workshop on Computational learning theory. 1992.
- Boswell, Dustin. "Introduction to support vector machines." Department of Computer Science and Engineering, University of California San Diego 11 (2002): 16-17.
- Vapnik, Vladimir. The nature of statistical learning theory. Springer science & business media, 2013.
- Vapnik, Vladimir N., and A. Ya Chervonenkis. "On the uniform convergence of relative frequencies of events to their probabilities." Measures of Complexity: Festschrift for Alexey Chervonenkis. Cham: Springer International Publishing, 2015. 11-30.
- Lloyd, Stuart. "Least squares quantization in PCM." IEEE transactions on information theory 28.2 (1982): 129-137.
- Pearson, Karl. "On lines and planes of closest fit to systems of points in space." The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science 2.11 (1901): 559-572.
- Abdi, Hervé, and Lynne J. Williams. "Principal component analysis." Wiley interdisciplinary reviews: computational statistics 2.4 (2010): 433-459.
- Christin, Sylvain, Éric Hervet, and Nicolas Lecomte. "Applications for deep learning in ecology." Methods in Ecology and Evolution 10.10 (2019): 1632-1644.
- Mamoshina, Polina, et al. "Applications of deep learning in biomedicine." Molecular pharmaceutics 13.5 (2016): 1445
- Dongare, A. D., R. R. Kharde, and Amit D. Kachare. "Introduction to artificial neural network." International Journal of Engineering and Innovative Technology (IJEIT) 2.1 (2012): 189-194.
- Burgess, John C. "Adaptive Signal Processing edited by Bernard Widrow and Samuel D. Stearns." (1986): 991-992.
- Cioffi, John, and Thomas Kailath. "Fast, recursive-least squares transversal filters for adaptive filtering." IEEE Transactions on Acoustics, Speech, and Signal Processing 32.2 (1984): 304-337.
- Mandic, Danilo P. "A generalized normalized gradient descent algorithm." IEEE signal processing letters 11.2 (2004): 115-118.
- Mathews, V. John, and Zhenhua Xie. "A stochastic gradient adaptive filter with gradient adaptive step size." IEEE transactions on Signal Processing 41.6 (1993): 2075-2087.
- Hochreiter, Sepp, and Jürgen Schmidhuber. "LSTM can solve hard long time lag problems." Advances in neural information processing systems 9 (1996).
- Dernoncourt, Franck. "Introduction to fuzzy logic." Massachusetts Institute of Technology 21 (2013): 50-56.
- Thaker, Shaily, and Viral Nagori. "Analysis of fuzzification process in fuzzy expert system." Procedia computer science 132 (2018): 1308-1316.
- Van Leekwijck, Werner, and Etienne E. Kerre. "Defuzzification: criteria and classification." Fuzzy sets and systems 108.2 (1999): 159-178.
- Manning, Christopher D., et al. "The Stanford CoreNLP natural language processing toolkit." Proceedings of 52nd annual meeting of the association for computational linguistics: system demonstrations. 2014.
- Babu, Nirmal Varghese, and E. Grace Mary Kanaga. "Sentiment analysis in social media data for depression detection using artificial intelligence: a review." SN computer science 3.1 (2022): 74.
- Wiley, Victor, and Thomas Lucas. "Computer vision and image processing: a paper review." International Journal of Artificial Intelligence Research 2.1 (2018): 29-36.
- Idrees, Haroon, Mubarak Shah, and Ray Surette. "Enhancing camera surveillance using computer vision: a research note." Policing: An International Journal 41.2 (2018): 292-307.
- Turk, Matthew, and Alex Pentland. "Eigenfaces for recognition." Journal of cognitive neuroscience 3.1 (1991): 71-86.
- Wang, Yi-Qing. "An analysis of the Viola-Jones face detection algorithm." Image Processing On Line 4 (2014): 128-148.
- Nafchi, Hossein Ziaei, and Seyed Morteza Ayatollahi. "A set of criteria for face detection preprocessing." Procedia Computer Science 13 (2012): 162-170.
- Saravanan, Chandran. "Color image to grayscale image conversion." 2010 second international conference on computer engineering and applications. Vol. 2. IEEE, 2010.
- Ghassabeh, Youness Aliyari, Frank Rudzicz, and Hamid Abrishami Moghaddam. "Fast incremental LDA feature extraction." Pattern Recognition 48.6 (2015): 1999-2012.
- Hoyer, Patrik O., and Aapo Hyvärinen. "Independent component analysis applied to feature extraction from color and stereo images." Network: computation in neural systems 11.3 (2000): 191-210.
- Guo, Shanshan, Shiyu Chen, and Yanjie Li. "Face recognition based on convolutional neural network and support vector machine." 2016 IEEE International Conference on Information and Automation (ICIA). IEEE, 2016.
- Sirovich, Lawrence, and Michael Kirby. "Low-dimensional procedure for the characterization of human faces." JOSA A 4.3 (1987): 519-524.
- Zhiqiang, Wang, and Liu Jun. "A review of object detection based on convolutional neural network." 2017 36th Chinese control conference (CCC). IEEE, 2017.
- Felzenszwalb, Pedro F., and Daniel P. Huttenlocher. "Efficient graph-based image segmentation." International journal of computer vision 59 (2004): 167-181.
- Sun, Degang, et al. "A scale balanced loss for bounding box regression." IEEE Access 8 (2020): 108438-108448.
- Jiang, Huaizu, and Erik Learned-Miller. "Face detection with the faster R-CNN." 2017 12th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2017). IEEE, 2017.
- He, Kaiming, et al. "Spatial pyramid pooling in deep convolutional networks for visual recognition." IEEE transactions on pattern analysis and machine intelligence 37.9 (2015): 1904-1916.
- Tzotsos, Angelos, and Demetre Argialas. "Support vector machine classification for object-based image analysis." Object-based image analysis: Spatial concepts for knowledge-driven remote sensing applications (2008): 663-677.
- A. Esteva, B. Kuprel, R. A. Novoa, J. Ko, S. M. Swetter, H. M. Blau, and S. Thrun, “Dermatologist-level classification of skin cancer with deep neural networks,” Nature, vol. 542, no. 7639, pp. 115–118, 2017.
- G. Litjens, T. Kooi, B. E. Bejnordi, A. A. A. Setio, F. Ciompi, M. Ghafoorian, J. A. W. M. van der Laak, B. van Ginneken, and C. I. Sánchez, “A survey on deep learning in medical image analysis,” Med. Image Anal., vol. 42, pp. 60–88, 2017.
- R. Miotto, F. Wang, S. Wang, X. Jiang, and J. T. Dudley, “Deep learning for healthcare: Review, opportunities and challenges,” Briefings in Bioinformatics, vol. 19, no. 6, pp. 1236–1246, 2018.
- A. Kamilaris and F. X. Prenafeta-Boldú, “Deep learning in agriculture: A survey,” Comput. Electron. Agric., vol. 147, pp. 70–90, 2018.
- K. G. Liakos, P. Busato, D. Moshou, S. Pearson, and D. Bochtis, “Machine learning in agriculture: A review,” Sensors, vol. 18, no. 8, p. 2674, 2018.
- J. West and M. Bhattacharya, “Intelligent financial fraud detection: A comprehensive review,” Comput. Secur., vol. 57, pp. 47–66, 2016.
- H. Zhang, R. Zhang, Y. Qian, and T. Mei, “Stock movement prediction with adversarial training,” in Proc. 26th ACM Int. Conf. Multimedia, pp. 1179–1187, 2018.
- J. Lee, B. Bagheri, and H.-A. Kao, “A cyber-physical systems architecture for industry 4.0-based manufacturing systems,” Manuf. Lett., vol. 3, pp. 18–23, 2015.
- O. Zawacki-Richter, V. I. Marín, M. Bond, and F. Gouverneur, “Systematic review of research on artificial intelligence applications in higher education – where are the educators?” Int. J. Educ. Technol. High. Educ., vol. 16, p. 39, 2019.
- M. S. Norouzzadeh, A. Nguyen, M. Kosmala, A. Swanson, M. Palmer, C. Packer, and J. Clune, “Automatically identifying, counting, and describing wild animals in camera-trap images with deep learning,” Proc. Natl. Acad. Sci. U.S.A., vol. 115, no. 25, pp. E5716–E5725, 2018.
- M. Bojarski, D. D. Testa, D. Dworakowski, B. Firner, B. Flepp, P. Goyal, et al., “End to end learning for self-driving cars,” arXiv preprint, arXiv:1604.07316, 2016.
- L. Zhang, S. Wang, and B. Liu, “Deep learning for sentiment analysis: A survey,” Wiley Interdiscip. Rev. Data Min. Knowl. Discov., vol. 8, no. 4, p. e1253, 2018.
- Liu, B. (2015). Sentiment analysis: Mining opinions, sentiments, and emotions. Cambridge University Press
- D. Gunning, “Explainable artificial intelligence (XAI),” DARPA, 2017.
- H. B. McMahan, E. Moore, D. Ramage, S. Hampson, and B. A. y Arcas, “Communication-efficient learning of deep networks from decentralized data,” arXiv preprint, arXiv:1602.05629, 2017.
- S. Gade, "Explainable Artificial Intelligence (XAI): A Brief Overview," Int. J. Comput. Appl., vol. 182, no. 31, pp. 1–5,
- S. M. Lundberg and S.-I. Lee, "A Unified Approach to Interpreting Model Predictions," in Advances in Neural Information Processing Systems (NeurIPS), 2017.
- M. T. Ribeiro, S. Singh, and C. Guestrin, "Why Should I Trust You? Explaining the Predictions of Any Classifier," in Proc. 22nd ACM SIGKDD Int. Conf. Knowl. Discov. Data Min., 2016, pp. 1135–1144.
- M. Sundararajan, A. Taly, and Q. Yan, "Axiomatic Attribution for Deep Networks," in Proc. Int. Conf. Mach. Learn. (ICML), 2017.
- H. B. McMahan et al., "Communication-Efficient Learning of Deep Networks from Decentralized Data," in Proc. 20th Int. Conf. Artif. Intell. Stat. (AISTATS), 2017, pp. 1273
- A. Hard et al., "Federated Learning for Mobile Keyboard Prediction," arXiv preprint arXiv:1811.03604, 2018.
- K. Bonawitz et al., "Towards Federated Learning at Scale: System Design," in Proc. 2nd SysML Conf., 2019.
- P. Kairouz et al., "Advances and Open Problems in Federated Learning," Found. Trends Mach. Learn., vol. 14, no. 1–2, pp. 1–210, 2021.