Hello, It's Me

Dr. M. Usman Ghani Khan

Chairman and Professor, Department of Computer Science, University of Engineering & Technology (UET), Lahore

I have over 20 years of research experience.

Download Resume

About Me

250 +

Publications

50 +

Research Grants

40 +

Foreign Expertise

500 +

Trained Professionals

I have over 20 years of research experience.

I am the Chairman of the Computer Science Department at UET Lahore and the Director of the Intelligent Criminology Research Lab under the National Center of Artificial Intelligence at the Al-Khwarizmi Institute of Computer Science (KICS), UET.

I have founded and directed five research labs at KICS, UET Lahore: the Computer Vision & Machine Learning Lab, Bioinformatics Lab, Virtual Reality & Gaming Lab, Data Science Lab, and Software Systems Research Lab. As a dedicated teacher and mentor, I specialize in subjects related to Artificial Intelligence, Machine Learning, and Deep Learning. I have also recorded freely available video lectures on YouTube for courses in Bioinformatics, Image Processing, Data Mining & Data Science, and Computer Programming.

Download Resume

More About Me

Citations per year

Citations: 4731

h-index: 36

i10-index: 98

Top 10 Research Papers

1. A deep learning approach for automated diagnosis and multi-class classification of Alzheimer's disease stages using resting-state fMRI and residual neural networks
2. A realistic image generation of face from text description using the fully trained generative adversarial networks
3. Brain tumor segmentation in multi-spectral MRI using convolutional neural networks (CNN)
4. Deep unified model for face recognition based on convolution neural network and edge computing
5. Deep learning model integrating features and novel classifiers fusion for brain tumor segmentation
6. Computer-assisted brain tumor type discrimination using magnetic resonance imaging features
7. Soft computing-based EEG classification by optimal feature selection and neural networks
8. Technologies and challenges in developing Machine-to-Machine applications: A survey
9. Microscopic abnormality classification of cardiac murmurs using ANFIS and HMM
10. GHS-NET a generic hybridized shallow neural network for multi-label biomedical text classification

Tools & Languages

C
C++
C#
Java
JavaScript
Python
Qt
HTML 5
CSS 3
Bootstrap
React
Microsoft SQL Server
SQLite
Oracle
MySQL
MongoDB
PostgreSQL
Firebase
Heroku
Amazon Web Services
Docker
Google Cloud
Microsoft Azure
Django
Flask
MATLAB
Postman
Adobe Photoshop
Scikit-Learn
Seaborn
OpenCV
Pandas
PyTorch
TensorFlow
Git
Linux

Research Papers

Khan, E., Rauf, S., Adeeba, F., and Hussain, "Intent Detection in Urdu Queries using Fine-tuned BERT Models," in Proceedings of O-COCOSDA 2021, Singapore.

Abstract— User intent detection provides essential cues for query understanding and accurate information retrieval in search engines and task-oriented dialogue systems. Detecting intent from user queries is challenging due to short query length and lack of sufficient context. Further, limited prior research on query intent detection has been conducted for Urdu, an under-resourced language. Building on the recent success of Bidirectional Encoder Representations from Transformers (BERT), which provides pre-trained language models, we propose an intent detection model for Urdu built by fine-tuning BERT variants for the intent detection task. We conduct rigorous experiments on monolingual and cross-lingual transfer learning approaches using the pre-trained BERT models mBERT, ArBERT, and roBERTa-urdu-small and two query datasets. Experimental evaluation reveals that the fine-tuned mBERT and roBERTa-urdu-small models achieve 96.38% and 93.30% accuracy on datasets I and II respectively, outperforming strong statistical and neural network baselines.

Shams, S., Sadia, B., and Aslam, M. Haseeb, "Intent Detection in Urdu Queries using Fine-tuned BERT Models," in Proceedings of ICOSST 2022, Lahore, Pakistan.


Our Staff

Javeria Khan, Team Lead, BRL Lab
Ahmad Hassan, Team Lead, CVML Lab
M. Nauman Hanif, Team Lead, IDL Lab
M. Usman Ghani Khan, Director