Please use this identifier to cite or link to this item:
http://dx.doi.org/10.25673/112995
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Velichkovska, Bojana | - |
dc.contributor.author | Petrushevska, Sandra | - |
dc.contributor.author | Runcheva, Bisera | - |
dc.contributor.author | Kalendar, Marija | - |
dc.date.accessioned | 2024-01-10T09:02:22Z | - |
dc.date.available | 2024-01-10T09:02:22Z | - |
dc.date.issued | 2023 | - |
dc.identifier.uri | https://opendata.uni-halle.de//handle/1981185920/114952 | - |
dc.identifier.uri | http://dx.doi.org/10.25673/112995 | - |
dc.description.abstract | Numerous studies have documented demographic bias in medical data and in artificial intelligence (AI) systems used in medical settings. These studies have also shown that such biases can significantly affect the access to and quality of care, as well as the quality of life, of patients belonging to under-represented groups. These groups are marginalised by stigma attached to demographic attributes such as race, gender, age, and ability. Since the performance of AI models depends heavily on the quality of the data used to train them, analysing the data for any bias inadvertently present in it is a necessary precaution for mitigating the consequences of building medical AI systems on biased data. We therefore propose a machine learning (ML) analysis that takes patient biosignals as input and examines them for two types of demographic bias: gender bias and age bias. The analysis is performed with several ML algorithms (Logistic Regression, Decision Trees, Random Forest, and XGBoost). The trained models are evaluated with a holdout technique and by inspecting the confusion matrices and classification reports. The results show that the models are capable of detecting bias in the data, making the proposed approach one way to identify bias, especially during the construction of AI-based medical systems. Consequently, the proposed pipeline can serve as a mitigation technique for bias analysis in data. | - |
dc.language.iso | eng | - |
dc.rights.uri | https://creativecommons.org/licenses/by-sa/4.0/ | - |
dc.subject | Artificial Intelligence | - |
dc.subject | Machine Learning | - |
dc.subject | Gender Bias | - |
dc.subject.ddc | 006.3 | - |
dc.title | Demographic bias in medical datasets for clinical AI | - |
local.versionType | publishedVersion | - |
local.publisher.universityOrInstitution | Hochschule Anhalt | - |
local.openaccess | true | - |
dc.identifier.ppn | 1873187793 | - |
cbs.publication.displayform | 2023 | - |
local.bibliographicCitation.year | 2023 | - |
cbs.sru.importDate | 2024-01-10T09:01:00Z | - |
local.bibliographicCitation | Contained in: Proceedings of the 11th International Conference on Applied Innovations in IT - Köthen, Germany : Edition Hochschule Anhalt, 2023 | - |
local.accessrights.dnb | free | - |
Appears in Collections: | International Conference on Applied Innovations in IT (ICAIIT) |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
2_5_ICAIIT_Paper_2023(2)_Velichkovska_30-1.pdf | | 939.34 kB | Adobe PDF | View/Open |
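The analysis described in the abstract (training several classifiers to predict a demographic attribute from biosignals, then evaluating them with a holdout split, confusion matrices, and classification reports) could be sketched as follows. This is a minimal illustration, not the authors' actual code: the synthetic dataset, feature dimensions, and binary target label stand in for the real patient biosignals and gender/age labels, and XGBoost is omitted to avoid an extra dependency.

```python
# Hypothetical sketch of the bias-detection pipeline: if classifiers can
# predict a demographic attribute (here a synthetic binary label standing
# in for gender) from biosignal features, the features carry demographic
# information, i.e. a potential source of bias.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report, confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Placeholder data standing in for patient biosignals and demographic labels.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Holdout evaluation: keep 20% of the samples unseen during training.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

models = {
    "LogisticRegression": LogisticRegression(max_iter=1000),
    "DecisionTree": DecisionTreeClassifier(random_state=0),
    "RandomForest": RandomForestClassifier(random_state=0),
    # The paper also evaluates XGBoost; omitted here for brevity.
}

results = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    y_pred = model.predict(X_test)
    results[name] = confusion_matrix(y_test, y_pred)
    print(name)
    print(results[name])
    print(classification_report(y_test, y_pred))
```

A model that predicts the demographic label well above chance on the holdout set signals that the biosignal features encode that attribute, which is the bias indicator the paper's pipeline looks for.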