Chairman and Professor, Department of Computer Science, University of Engineering & Technology, Lahore
I am the Chairman of the Computer Science Department at UET Lahore and the Director of the Intelligent Criminology Research Lab under the National Center of Artificial Intelligence at the Al-Khwarizmi Institute of Computer Science (KICS), UET.
I have founded and directed five research labs at KICS, UET Lahore: the Computer Vision & Machine Learning Lab, the Bioinformatics Lab, the Virtual Reality & Gaming Lab, the Data Science Lab, and the Software Systems Research Lab. As a dedicated teacher and mentor, I specialize in Artificial Intelligence, Machine Learning, and Deep Learning. I have also recorded freely available video lectures on YouTube for courses in Bioinformatics, Image Processing, Data Mining & Data Science, and Computer Programming.
Citations: 4731
h-index: 36
i10-index: 98
Khan, E., Rauf, S., Adeeba, F., and Hussain, "Intent Detection in Urdu Queries using Fine-tuned BERT Models," in Proceedings of O-COCOSDA 2021, Singapore.
Abstract— A user's intent provides essential cues for query understanding and accurate information retrieval in search engines and task-oriented dialogue systems. Intent detection from user queries is challenging due to short query lengths and a lack of sufficient context. Further, little prior research on query intent detection has been conducted for Urdu, an under-resourced language. Building on the recent success of Bidirectional Encoder Representations from Transformers (BERT), which provides pre-trained language models, we develop intent detection models for Urdu by fine-tuning BERT variants. We conduct rigorous experiments on monolingual and cross-lingual transfer learning approaches using pre-trained BERT models (mBERT, ArBERT, and roBERTa-urdu-small) and two query datasets. Experimental evaluation reveals that the fine-tuned mBERT and roBERTa-urdu-small models achieve 96.38% and 93.30% accuracy on datasets I and II respectively, outperforming strong statistical and neural network baselines.
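The abstract compares fine-tuned BERT models against "strong statistical baselines" without specifying them. As a rough illustration only, a classic statistical approach to intent detection is a bag-of-words Naive Bayes classifier; the sketch below uses toy Romanized-Urdu queries and hypothetical intent labels, none of which come from the paper.

```python
# Hedged sketch: a minimal bag-of-words Naive Bayes intent classifier,
# the kind of statistical baseline such papers typically compare against.
# All queries and intent labels here are invented for illustration.
import math
from collections import Counter, defaultdict

def train_nb(samples):
    """samples: list of (query_tokens, intent_label) pairs."""
    class_counts = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for tokens, label in samples:
        class_counts[label] += 1
        word_counts[label].update(tokens)
        vocab.update(tokens)
    return class_counts, word_counts, vocab

def predict(model, tokens):
    """Return the intent with the highest Laplace-smoothed log-probability."""
    class_counts, word_counts, vocab = model
    total = sum(class_counts.values())
    best, best_lp = None, float("-inf")
    for label, count in class_counts.items():
        lp = math.log(count / total)  # class prior
        denom = sum(word_counts[label].values()) + len(vocab)
        for t in tokens:  # add-one smoothed word likelihoods
            lp += math.log((word_counts[label][t] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# Toy Romanized-Urdu queries with hypothetical intent labels.
train = [
    (["mausam", "kaisa", "hai"], "weather"),
    (["aaj", "mausam", "batao"], "weather"),
    (["gana", "chalao"], "music"),
    (["naya", "gana", "sunao"], "music"),
]
model = train_nb(train)
print(predict(model, ["kal", "mausam", "kaisa", "hoga"]))  # → weather
```

Such a baseline needs no pre-training, which is precisely why short, context-poor queries favor fine-tuned contextual models like mBERT in the reported results.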
Shams, S., Sadia, B., and Aslam, M. Haseeb, "Intent Detection in Urdu Queries using Fine-tuned BERT Models," in Proceedings of ICOSST 2022, Lahore, Pakistan.