A Study of the Application Domain of Large Language Models in the Agricultural Sector

Authors

  • Saikat Banerjee, State Aided College Teacher, Department of Computer Applications, Vivekananda Mahavidyalaya, Haripal, Hooghly, West Bengal, India
  • Soumitra Das, Department of Computer Science, University of Burdwan, Golapbag, West Bengal, India
  • Abhoy Chand Mondal, Professor & Head, Department of Computer Science, University of Burdwan, Golapbag, West Bengal, India

Keywords

Artificial Intelligence (AI), Large Language Model, Agriculture, Natural Language Processing (NLP), Machine Learning

Abstract

Given the expanding global population and the increasing demand for food, employing effective agricultural techniques to enhance productivity on finite land resources is imperative. Artificial Intelligence is increasingly widespread in agriculture, and Artificial Intelligence-driven solutions enhance existing farming systems. Agricultural productivity relies on soil nutrient composition, moisture levels, crop rotation, precipitation, and temperature, among other factors, and Artificial Intelligence-based products can utilize these characteristics to monitor agrarian productivity. Industries are increasingly adopting Artificial Intelligence technologies to enhance and streamline agricultural activities across the whole food supply chain. Agricultural applications and solutions utilizing Artificial Intelligence have been developed to support farmers in precise and controlled farming practices; these applications provide accurate guidance on water management, crop rotation, timely harvesting, crop selection, optimal planting, pest control, and nutrition management. Artificial Intelligence-enabled systems utilize data such as precipitation, wind speed, temperature, and solar radiation, together with images captured by satellites and drones, to compute weather forecasts, monitor the sustainability of agriculture, and evaluate farms for the presence of infectious diseases, pests, or undernourished plants. A large language model is a form of artificial intelligence that employs deep learning techniques to analyse and comprehend natural language. It is trained on extensive text datasets to discern statistical correlations between words and phrases, and it can then generate text, translate material, and perform other natural language processing tasks. This research examines how large language models can be applied in the agricultural sector.
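
To make the closing point concrete, the following is a minimal sketch, not taken from the paper, of how an agricultural advisory query might be posed to a pretrained large language model. It assumes the Hugging Face transformers library and the small public gpt2 checkpoint; the prompt text and model choice are illustrative assumptions, and a deployed system would substitute a larger, domain-adapted model.

    # A minimal sketch, assuming the Hugging Face "transformers" library;
    # the paper itself does not specify an implementation.
    from transformers import pipeline

    # Load a small, publicly available language model. A production
    # advisory tool would use a larger, agriculture-tuned LLM.
    generator = pipeline("text-generation", model="gpt2")

    # Hypothetical farmer query combining soil and weather conditions.
    prompt = ("Advice for a rice farmer whose soil is low in nitrogen "
              "and whose region expects below-average rainfall:")

    # The model continues the prompt using statistical patterns learned
    # during pre-training on a large text corpus.
    output = generator(prompt, max_new_tokens=60, num_return_sequences=1)
    print(output[0]["generated_text"])

Swapping the pipeline task (for example, "translation" or "question-answering") covers the other natural language processing operations mentioned above in the same way.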

References

A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. N. Gomez, Ł. Kaiser, and I. Polosukhin, "Attention is all you need," in Proc. 31st Int. Conf. Neural Information Processing Systems (NIPS), Curran Associates Inc., 2017, pp. 6000-6010. Available From: https://user.phil.hhu.de/~cwurm/wp-content/uploads/2020/01/7181-attention-is-all-you-need.pdf

T. B. Brown, B. Mann, N. Ryder, M. Subbiah, J. Kaplan, P. Dhariwal, et al., "Language models are few-shot learners," in Advances in Neural Information Processing Systems, vol. 33, pp. 1877-1901, 2020. Available From: https://splab.sdu.edu.cn/GPT3.pdf

M. Shoeybi, M. Patwary, R. Puri, P. LeGresley, J. Casper, and B. Catanzaro, "Megatron-LM: Training multi-billion parameter language models using model parallelism," arXiv preprint arXiv:1909.08053, 2019. Available From: https://doi.org/10.48550/arXiv.1909.08053

A. Radford, J. Wu, R. Child, D. Luan, D. Amodei, and I. Sutskever, "Language models are unsupervised multitask learners," OpenAI Blog, vol. 1, no. 8, p. 9, 2019. Available From: https://insightcivic.s3.us-east-1.amazonaws.com/language-models.pdf

E. Strubell, A. Ganesh, and A. McCallum, "Energy and policy considerations for deep learning in NLP," arXiv preprint arXiv:1906.02243, 2019. Available From: https://doi.org/10.48550/arXiv.1906.02243

G. Thakkar, R. de Penning, Y. Bhagwat, S. Mehendale, and P. Joshi, "Fairness and bias in natural language processing," in Proc. 29th Int. Conf. Computational Linguistics, 2021, pp. 4665-4677.

A. Ramesh, M. Pavlov, G. Goh, S. Gray, C. Voss, A. Radford, et al., "Zero-shot text-to-image generation," in Proc. 38th Int. Conf. Machine Learning, 2021. Available From: https://proceedings.mlr.press/v139/ramesh21a.html

J. Devlin, M. W. Chang, K. Lee, and K. Toutanova, "BERT: Pre-training of deep bidirectional transformers for language understanding," in Proc. 2019 Conf. North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Vol. 1 (Long and Short Papers), 2019, pp. 4171-4186.

DeepSeek-Coder-V2-Instruct-0724. Available From: https://huggingface.co/deepseek-ai/DeepSeek-Coder-V2-Instruct-0724.

Y. LeCun, L. Bottou, Y. Bengio, and P. Haffner, "Gradient-based learning applied to document recognition," Proc. IEEE, vol. 86, no. 11, pp. 2278-2324, Nov. 1998. Available From: https://doi.org/10.1109/5.726791

A. Krizhevsky, I. Sutskever, and G. E. Hinton, "ImageNet classification with deep convolutional neural networks," in Advances in Neural Information Processing Systems, vol. 25, pp. 1097-1105, 2012. Available From: https://proceedings.neurips.cc/paper/2012/hash/c399862d3b9d6b76c8436e924a68c45b-Abstract.html

K. He, X. Zhang, S. Ren, and J. Sun, "Deep residual learning for image recognition," in Proc. IEEE Conf. Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 2016, pp. 770-778. Available From: https://openaccess.thecvf.com/content_cvpr_2016/html/He_Deep_Residual_Learning_CVPR_2016_paper.html

T. Mikolov, I. Sutskever, K. Chen, G. Corrado, and J. Dean, "Distributed representations of words and phrases and their compositionality," in Advances in Neural Information Processing Systems, vol. 26, Lake Tahoe, NV, USA, 2013, pp. 3111-3119. Available From: https://proceedings.neurips.cc/paper/2013/hash/9aa42b31882ec039965f3c4923ce901b-Abstract.html

D. P. Kingma and M. Welling, "Auto-Encoding Variational Bayes," in Proc. 2nd Int. Conf. Learning Representations (ICLR), Banff, Canada, 2014. Available From: https://www.ee.bgu.ac.il/~rrtammy/DNN/StudentPresentations/2018/AUTOEN~2.PDF

G. E. Dahl, T. N. Sainath, and G. E. Hinton, "Improving deep neural networks for LVCSR using rectified linear units and dropout," in Proc. IEEE Int. Conf. Acoustics, Speech and Signal Processing (ICASSP), Vancouver, BC, Canada, 2013, pp. 8609-8613. Available From: https://doi.org/10.1109/ICASSP.2013.6639346

Published

2024-10-02

How to Cite

[1] S. Banerjee, S. Das, and A. C. Mondal, “A Study of the Application Domain of Large Language Models in the Agricultural Sector”, IJIRCST, vol. 12, no. 5, pp. 74–78, Oct. 2024.
