Indian Institute of Technology Madras (IIT-Madras) and Queen's University Belfast in the UK have developed a new technology to make Artificial Intelligence fairer. Researchers from the two institutes collaborated on FairKM, a fair clustering algorithm designed to reduce bias when AI processes data. FairKM can account for any number of quantified sensitive attributes, leading to a fairer process, and it marks a noteworthy step towards algorithms that ensure fairness in shortlisting. In human resources, its contribution could be immense. FairKM can also come in handy across a range of data scenarios where AI plays a role in decision-making, such as in policing for the detection of suspicious activities and crime prevention.
Companies often use AI technologies to sort through large amounts of data, for example in policing, when there is a large volume of CCTV footage linked to a crime, or in recruitment, when a job vacancy is oversubscribed. For the first time, a research team has developed a method that can attain fairness across many sensitive attributes at once. This is especially important in developing countries such as India. Many countries have major social and economic disparities, and these are reflected in their data. Applying AI techniques directly to raw data yields biased insights, which shape public policy and could amplify existing disparities.
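FairKM's exact objective is defined in the researchers' paper; as a rough illustration only (not the published algorithm), the idea of measuring how far a clustering drifts from fair representation can be sketched as below. The function name and the averaged squared-deviation measure are this sketch's own assumptions: it compares each cluster's composition over a sensitive attribute (e.g. gender) with the composition of the whole dataset, so a score of zero means every cluster mirrors the overall population.

```python
import numpy as np

def fairness_deviation(labels, sensitive):
    """Illustrative fairness measure (not FairKM's actual objective):
    mean squared deviation between each cluster's sensitive-group
    proportions and the overall proportions. 0 = perfectly balanced."""
    labels = np.asarray(labels)
    sensitive = np.asarray(sensitive)
    groups = np.unique(sensitive)
    # Proportion of each sensitive group in the full dataset.
    overall = np.array([(sensitive == g).mean() for g in groups])
    deviations = []
    for c in np.unique(labels):
        members = sensitive[labels == c]
        # Proportion of each sensitive group inside this cluster.
        props = np.array([(members == g).mean() for g in groups])
        deviations.append(((props - overall) ** 2).sum())
    return float(np.mean(deviations))

# A clustering that mixes groups evenly scores 0; one that
# separates them entirely scores higher.
balanced = fairness_deviation([0, 0, 1, 1], ["F", "M", "F", "M"])  # 0.0
skewed = fairness_deviation([0, 0, 1, 1], ["F", "F", "M", "M"])    # 0.5
```

A fairness-aware clustering method penalises a term like this alongside the usual clustering cost, steering cluster assignments towards balanced compositions rather than optimising compactness alone.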
The research is scheduled to be presented at the EDBT 2020 conference in Copenhagen, Denmark, in April 2020.