Classification via Sixteen Optimized Deep Learning Models
Version 4.0.0 (154 KB) by Hasan
Code related to our study that provides both binary and multiclass categorization using sixteen different optimized transfer learning models
To contribute to scientific research, we are sharing the code from our study, which provides both binary and multiclass classification using sixteen different optimized transfer learning architectures.
We recommend three ways to use the shared code:
1) Use the code as-is and apply any of the sixteen optimized architectures directly to your own image dataset (for both binary and multiclass problems):
* ResNet50-CDW-PSO, ResNet50-ASPSO, ResNet50-MSGO, ResNet50-CSA,
* ResNet18-CDW-PSO, ResNet18-ASPSO, ResNet18-MSGO, ResNet18-CSA,
* MobileNetV2-CDW-PSO, MobileNetV2-ASPSO, MobileNetV2-MSGO, MobileNetV2-CSA,
* ShuffleNet-CDW-PSO, ShuffleNet-ASPSO, ShuffleNet-MSGO, ShuffleNet-CSA
2) Plug your own optimization technique into the code section where the optimization method is located (you can propose different optimized architectures with your own algorithm while keeping the MobileNetV2, ResNet50, ResNet18, and ShuffleNet models fixed); a hedged sketch of the loop such a replacement needs to drive follows this list.
3) Propose different optimized architectures by revising the TL models within the code (this time keeping the CDW-PSO, ASPSO, MSGO, and CSA methods fixed).
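As a rough illustration of option 2, the sketch below shows the kind of loop a replacement optimizer has to drive. All names here are illustrative assumptions, not the repository's actual signatures; in the shared code this role is played by ‘optimization_met.m’ together with ‘fit_fun.m’.

% Hypothetical sketch (assumed interface): a custom metaheuristic only
% needs to propose continuous candidate vectors in [0, 1] and consume
% the error value returned by the fitness evaluation.
nParams = 6;                         % six tuned hyperparameters
nIters  = 20;                        % illustrative iteration budget
best    = rand(1, nParams);          % initial continuous candidate
bestErr = evaluate_candidate(best);  % assumed wrapper: train TL model, return error
for it = 1:nIters
    cand = min(max(best + 0.1*randn(1, nParams), 0), 1);  % replace with your own update rule
    err  = evaluate_candidate(cand);
    if err < bestErr
        best = cand;  bestErr = err;
    end
end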
We hope you find it useful...
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------
If you use this code, please cite the following article:
- Öcal, A., & Koyuncu, H. (2024). An in-depth study to fine-tune the hyperparameters of pre-trained transfer learning models with state-of-the-art optimization methods: Osteoarthritis severity classification with optimized architectures. Swarm and Evolutionary Computation, 89, 101640.
To credit the scientists who developed these optimization methods, please also cite the article corresponding to the algorithm you use:
MSGO:
- Naik, A., Satapathy, S. C., & Abraham, A. (2020). Modified Social Group Optimization—A meta-heuristic algorithm to solve short-term hydrothermal scheduling. Applied Soft Computing, 95, 106524.
ASPSO:
- Wang, R., Hao, K., Chen, L., Wang, T., & Jiang, C. (2021). A novel hybrid particle swarm optimization using adaptive strategy. Information Sciences, 579, 231-250.
CDW-PSO:
- Chen, K., Zhou, F., & Liu, A. (2018). Chaotic dynamic weight particle swarm optimization for numerical function optimization. Knowledge-Based Systems, 139, 23-40.
CSA:
- Talatahari, S., Azizi, M., Tolouei, M., Talatahari, B., & Sareh, P. (2021). Crystal structure algorithm (CryStAl): a metaheuristic optimization method. IEEE Access, 9, 71244-71261.
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Related Article: [1] Öcal, A., & Koyuncu, H. (2024). An in-depth study to fine-tune the hyperparameters of pre-trained transfer learning models with state-of-the-art optimization methods: Osteoarthritis severity classification with optimized architectures. Swarm and Evolutionary Computation, 89, 101640.
The folders contain the different optimized architectures, namely:
- ResNet18-CDW-PSO
- ResNet18-ASPSO
- ResNet18-MSGO
- ResNet18-CSA
- ResNet50-CDW-PSO
- ResNet50-ASPSO
- ResNet50-MSGO
- ResNet50-CSA
- MobileNetV2-CDW-PSO
- MobileNetV2-ASPSO
- MobileNetV2-MSGO
- MobileNetV2-CSA
- ShuffleNet-CDW-PSO
- ShuffleNet-ASPSO
- ShuffleNet-MSGO
- ShuffleNet-CSA
In every folder:
- ‘result_gen.m’ reports the binary or multiclass classification results.
- ‘range_arrangement.m’ performs the value assignment for a specific discrete hyperparameter (it transforms the continuous output of the optimization method into a discrete value) [1]; a hedged sketch of this mapping follows this list.
- ‘preprocess_islemi.m’ is used to define the data in 3D.
- ‘param_adjust.m’ handles the adjustment of six hyperparameters: batch size, optimizer, learning rate, hidden layer number, epoch value, and hidden node number [1]; see the second sketch after this list.
- ‘fit_fun.m’ generates an error value from the accuracy rate (also illustrated in the first sketch below).
- ‘optimization_met.m’ produces floating-point numbers according to the hyperparameter requirements (it can be replaced with your own optimization method).
- ‘transferlearning_tool.m’ applies the defined transfer learning model to the dataset using the hyperparameters assigned in the code (it can be replaced with other transfer learning models).
- ‘main_code.m’ contains the main part of the code, arranged according to the optimization method in use.
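For orientation, here is a minimal sketch of two of the ideas above: the continuous-to-discrete mapping performed by ‘range_arrangement.m’ and the accuracy-to-error conversion performed by ‘fit_fun.m’. The candidate values and the exact formulas are assumptions for illustration; the repository's actual rules may differ.

% Assumed mapping: snap a continuous optimizer output x in [0, 1]
% onto a discrete candidate set, e.g. for batch size.
batchCandidates = [8 16 32 64];     % assumed candidate values
x   = 0.62;                         % continuous output of the optimizer
idx = min(numel(batchCandidates), floor(x * numel(batchCandidates)) + 1);
batchSize = batchCandidates(idx);   % -> 32 for x = 0.62

% Assumed fitness: optimizers minimize, so accuracy becomes an error.
accuracy = 0.93;                    % validation accuracy in [0, 1]
fitness  = 1 - accuracy;            % lower is better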
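And a second sketch of how four of the six hyperparameters could feed MATLAB's trainingOptions, with the remaining two (hidden layer and hidden node numbers) shaping the classifier head instead. This is an illustrative pattern, not the code of ‘param_adjust.m’ or ‘transferlearning_tool.m’.

% Illustrative values; in the shared code these come from the optimizer.
opts = trainingOptions('sgdm', ...          % optimizer choice (discrete)
    'InitialLearnRate', 1e-3, ...           % learning rate (continuous)
    'MiniBatchSize',    32, ...             % batch size (discrete)
    'MaxEpochs',        20);                % epoch value (discrete)
% Hidden layer/node numbers shape the network head instead, e.g.:
numClasses = 2;                             % binary case, for illustration
head = [fullyConnectedLayer(128)            % hidden node number
        reluLayer
        fullyConnectedLayer(numClasses)
        softmaxLayer
        classificationLayer];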
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Important Information:
- Before running the code, change the data path specified in the ‘transferlearning_tool.m’ file to the location of your data; a minimal loading sketch follows this list.
- In the data folder, labels are assigned through sub-folders; make sure no two sub-folders share the same name.
- If the transfer learning models are not installed in your MATLAB, download them from the MathWorks Add-On Explorer.
- For the optimized TL architectures, computation time varies from one model to another and can be long. Please be patient :)
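A minimal sketch of how such a labeled folder can be read in MATLAB; the path is an assumption for illustration, and the repository's own loading code in ‘transferlearning_tool.m’ may differ.

% Illustrative path; replace with the location of your data.
dataPath = 'C:\myData';
% Each uniquely named sub-folder is read as one class label.
imds = imageDatastore(dataPath, ...
    'IncludeSubfolders', true, ...
    'LabelSource', 'foldernames');
countEachLabel(imds)   % sanity-check the class distribution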
Cite As
Öcal, Aysun, and Hasan Koyuncu. “An in-Depth Study to Fine-Tune the Hyperparameters of Pre-Trained Transfer Learning Models with State-of-the-Art Optimization Methods: Osteoarthritis Severity Classification with Optimized Architectures.” Swarm and Evolutionary Computation, vol. 89, Aug. 2024, p. 101640, https://doi.org/10.1016/j.swevo.2024.101640.
MATLAB Release Compatibility
Created with
R2024a
Compatible with any release
Platform Compatibility
Windows macOS Linux
