A UK-based group of independent girls’ schools has held its fifth annual techathon, gathering more than 100 pupils from 25 schools to work on projects using artificial intelligence (AI) for social good.
The Girls’ Day School Trust (GDST), which consists of 25 independent girls’ schools and 19,000 pupils from across the country, runs the techathon to inspire and encourage more young women to pursue careers in technology.
Only 15% of people working in engineering and programming roles are women, according to a report by the Tech Talent Charter, while a recent survey by Kaspersky Lab found that the gender imbalance meant over a third of women in the tech sector felt uncomfortable when embarking on their careers in IT.
Amy Icke, online learning and innovation manager at GDST, cited a number of reasons for women being so under-represented in the technology sector.
“Partly you can trace that back to the education system,” she said. “In co-education environments in particular, girls often feel under-confident in those settings. Add to that things like societal pressure.”
To combat this, the GDST invites female mentors and panellists to the techathon to support students and answer questions, highlighting the importance of having female role models.
Icke added that, within AI specifically, there was an ethical importance in getting women involved.
“Traditionally, a lot of our systems have been developed by male developers. I don’t want to stereotype that women will instinctively think more ethically or be more risk averse, but I think there is something about having different voices around the table that means different opinions are represented,” she said.
“We need to make sure we have good datasets to [help us make] informed decisions.”
Inventing AI tools with a social benefit
Tasked with creating a product or service that uses AI for social good, the attending students had their inventions judged on business credibility, design creativity, presentation quality and originality.
The AI products developed by the students, all aged between 11 and 18, ranged from tools that monitor micro-expressions to evaluate whether a person is lying, to anxiety-relief chatbots and bullying detection algorithms.
The products were developed alongside introductory AI workshops, using tools from Google AI to get students thinking about the possibilities of the technology. The teams were supported by an all-female group of mentors.
“One of the real highlights this year has been the strength of the mentors. The business acumen and the partnership between industry and schools that the mentors can give is not something we have much time for in schools,” said Icke. “The curriculum doesn’t really allow space for that.”
Three prizes were awarded at the end of the event.
Students from Belvedere Academy and Norwich High School For Girls took first place for their innovative agricultural tool.
Designed to help farmers in developing countries monitor their pregnant cattle, The Tail Won’t Fail tool gathers data on livestock to predict when they will give birth so that farmers can intervene when most appropriate, allowing them to allocate their limited resources more effectively.
Students from Northampton and Sutton high schools came second with iAid, an earpiece which connects to Google Maps and uses voice recognition and simulated speech to help visually impaired people navigate.
The third prize – a People’s Choice Award – was decided by votes cast by the students themselves. This was awarded to students from Blackheath and Bromley high schools, who developed a language app that interprets idiosyncratic phrases to help autistic people better understand them.
The techathon was a collaborative event run in partnership with Mortimer Spinks and Skills Matter.