
Neural-Networks-and-Deep-Learning-Course-Projects-S2025

Neural Networks and Deep Learning course projects from the Spring 2025 semester at the University of Tehran, under the supervision of Dr. Kalhor.


Implemented MLPs from scratch for classification and regression tasks, analyzing the effects of hyperparameters, optimizers, and normalization.
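
A minimal sketch of the from-scratch approach, with a toy dataset and illustrative layer sizes and learning rate (none of these are the course's actual settings):

```python
# From-scratch MLP: one hidden layer, manual backprop, plain gradient descent.
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification data (a stand-in for the course datasets).
X = rng.normal(size=(256, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)

W1 = rng.normal(scale=0.5, size=(2, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)
lr = 0.5  # illustrative learning rate

for epoch in range(500):
    # Forward pass: tanh hidden layer, sigmoid output.
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    # Backward pass: binary cross-entropy gradient is (p - y) at the logit.
    dz2 = (p - y) / len(X)
    dW2 = h.T @ dz2; db2 = dz2.sum(0)
    dh = dz2 @ W2.T * (1 - h**2)  # backprop through tanh
    dW1 = X.T @ dh; db1 = dh.sum(0)
    # Gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("train accuracy:", ((p > 0.5) == y).mean())
```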

Trained CNNs for skin cancer and leaf disease detection using transfer learning (NASNet, MobileNetV2, EfficientNetB6) and compared optimizers.
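
A minimal transfer-learning sketch with a frozen MobileNetV2 backbone; the 7-class head, dropout rate, and learning rate are assumptions for illustration:

```python
import tensorflow as tf

# Pretrained ImageNet backbone with the classification top removed.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False  # freeze pretrained features; train only the new head

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(7, activation="softmax"),  # e.g. 7 lesion classes (assumed)
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # datasets not shown here
```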

Implemented VGG-UNet for brain tumor segmentation. Trained Faster R-CNN and SSD300 for traffic sign detection, comparing mAP, inference speed, and scale robustness.
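
A sketch of loading a pretrained Faster R-CNN from torchvision and swapping in a new box predictor for traffic signs; the class count is an illustrative assumption, not the project's dataset spec:

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

# COCO-pretrained detector; we replace only the box-classification head.
model = fasterrcnn_resnet50_fpn(weights="DEFAULT")
num_classes = 43 + 1  # e.g. 43 sign classes + background (assumed count)
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)

model.eval()
with torch.no_grad():
    # Inference takes a list of CHW tensors; output is boxes/labels/scores per image.
    preds = model([torch.rand(3, 512, 512)])
print(preds[0]["boxes"].shape)
```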

Built a CNN-LSTM for Persian spam detection using ParsBERT embeddings, and LSTM/GRU/Bi-LSTM models for crude oil price forecasting, benchmarking against ARIMA/SARIMA baselines.
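
A minimal sketch of the CNN-LSTM classifier shape, assuming 768-dimensional ParsBERT token embeddings as input; channel and hidden sizes are illustrative:

```python
import torch
import torch.nn as nn

class CNNLSTM(nn.Module):
    def __init__(self, emb_dim=768, conv_ch=128, hidden=64):
        super().__init__()
        # 1-D convolution over the token axis extracts local n-gram features.
        self.conv = nn.Conv1d(emb_dim, conv_ch, kernel_size=3, padding=1)
        # LSTM models longer-range order over the convolved features.
        self.lstm = nn.LSTM(conv_ch, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)  # spam vs. ham

    def forward(self, x):  # x: (batch, seq_len, emb_dim)
        h = torch.relu(self.conv(x.transpose(1, 2))).transpose(1, 2)
        _, (h_n, _) = self.lstm(h)
        return self.head(h_n[-1])  # classify from the final hidden state

logits = CNNLSTM()(torch.randn(4, 50, 768))  # 4 sentences, 50 tokens each
print(logits.shape)  # torch.Size([4, 2])
```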

Built a Transformer for wind power forecasting, benchmarking it against RNN and MLP baselines. Applied a ViT to white blood cell image classification and analyzed its fine-tuning behavior against DenseNet-121.
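
A minimal sketch of a Transformer encoder for one-step-ahead series forecasting; the window length, model width, and head count are illustrative, not the project's settings:

```python
import torch
import torch.nn as nn

class TSTransformer(nn.Module):
    def __init__(self, d_model=64, nhead=4, layers=2, window=48):
        super().__init__()
        self.inp = nn.Linear(1, d_model)  # scalar series -> model width
        self.pos = nn.Parameter(torch.zeros(1, window, d_model))  # learned positions
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=layers)
        self.head = nn.Linear(d_model, 1)  # predict the next value

    def forward(self, x):  # x: (batch, window, 1)
        h = self.encoder(self.inp(x) + self.pos)
        return self.head(h[:, -1])  # read out from the last time step

y_hat = TSTransformer()(torch.randn(8, 48, 1))
print(y_hat.shape)  # torch.Size([8, 1])
```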

Developed a Triplet-VAE for unsupervised brain tumor detection on MRI data, with Gated Cross-Skip decoding and multi-loss optimization. Implemented an AdvGAN to generate adversarial attacks on ResNet-20, evaluating perturbation fidelity and attack success rate.
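
A sketch of the multi-loss idea behind a Triplet-VAE, combining reconstruction, KL, and a triplet margin loss on latent codes; the loss weights, tensor shapes, and placeholder inputs are assumptions, not the course model:

```python
import torch
import torch.nn.functional as F

def triplet_vae_loss(recon, target, mu, logvar, mu_pos, mu_neg,
                     beta=1.0, gamma=0.5, margin=1.0):
    rec = F.mse_loss(recon, target)                                # reconstruction
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())  # KL(q || N(0, I))
    tri = F.triplet_margin_loss(mu, mu_pos, mu_neg, margin=margin) # latent separation
    return rec + beta * kl + gamma * tri  # weighted sum (weights are illustrative)

# Dummy tensors standing in for encoder/decoder outputs on a batch of 8 slices.
B, D = 8, 32
loss = triplet_vae_loss(torch.rand(B, 1, 64, 64), torch.rand(B, 1, 64, 64),
                        torch.randn(B, D), torch.randn(B, D),
                        torch.randn(B, D), torch.randn(B, D))
print(loss.item())
```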

Developed CNN-RNN and attention-based CNN-RNN encoder-decoder models for image captioning on Flickr8k, evaluated via BLEU scores.
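
A sketch of BLEU-1 through BLEU-4 scoring with NLTK; the caption and references below are made-up examples (Flickr8k supplies several reference captions per image):

```python
from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

references = [[  # one image: multiple tokenized reference captions
    "a dog runs across the grass".split(),
    "a brown dog is running outside".split(),
]]
candidates = ["a dog is running on the grass".split()]  # generated caption

smooth = SmoothingFunction().method1  # avoids zero scores on short captions
for n in range(1, 5):
    weights = tuple(1.0 / n for _ in range(n))  # uniform up-to-n-gram weights
    score = corpus_bleu(references, candidates, weights=weights,
                        smoothing_function=smooth)
    print(f"BLEU-{n}: {score:.3f}")
```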

Fine-tuned the LLaMA-3.2-3B-Instruct model on a Persian dialogue dataset using parameter-efficient methods (soft prompting, LoRA) and partial full fine-tuning (unfreezing the first and last layers), built with HuggingFace Transformers and PEFT, improving instruction-following quality under limited compute.
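
A sketch of the LoRA setup with HuggingFace PEFT; the rank, alpha, dropout, and target modules are illustrative choices (the model is gated and requires HuggingFace access):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_id = "meta-llama/Llama-3.2-3B-Instruct"  # gated; needs HF authentication
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

lora_cfg = LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05,      # illustrative LoRA hyperparameters
    target_modules=["q_proj", "v_proj"],          # adapt attention projections only
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()  # LoRA trains a small fraction of the weights
```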

About

Fully Connected Neural Networks, Multilayer Neural Networks, MAdaline, CNNs, Segmentation, Detection, RNNs, CNN-LSTM, LSTM, Bi-LSTM, GRU, Transformers, Huber Loss, ViT, DGMs, Triplet VAE, AdvGAN, Image Caption Generation, attention, LLM Fine-Tuning, Soft Prompting, LoRA, Layer Freezing, SlimOrca
