Teaching AI to UNSW Actuarial Students
When I became a lecturer, I never expected to be assessing the lyrics generated by a student’s AI that was trained to produce Taylor Swift-like pop songs (e.g., “ohohoh loving him was red touching him was like realising all of drop it all eyes and I…”). How did this happen?
Last year, I created and delivered a new course in UNSW’s Actuarial Studies program under the title Artificial Intelligence and Deep Learning Models for Actuarial Applications. Many of the students – who were quite brave to enrol in a brand-new course – came without any deep knowledge of AI or Python coding, and learned to solve a variety of machine learning problems using neural networks.
I focused on creating a very practically oriented learning experience. For each AI technique I covered, I made sure the students knew how to implement it themselves, and that they practised doing exactly that. The only way to learn any programming-related skill is to fail at it repeatedly until the ‘intellectual muscle memory’ is built up.
So, in the first weeks, they learned enough of the Python programming language to be dangerous; then, in the following lectures, we discussed the specifics of handling tabular data, time series, images, and text, as well as methods for generating some of these data types. The students experimented with their new AI toys every week in a peer-to-peer formative assessment style that my colleagues here call StoryWall. Some of these tasks included:
- Chess: Make a traditional rules-based AI to play chess.
- Motor: Predict claim frequencies for a portfolio of French motor contracts (a minimal sketch of this one follows the list).
- Stroke: Predict the incidence of strokes given health information.
- Stocks: Use Recurrent Neural Networks for time series forecasting of stock prices.
- Hurricane damage: Given a satellite image, use Convolutional Neural Networks to classify whether a hurricane caused damage to the property or not.
- Police reports: Given police reports of motor accidents, use Natural Language Processing to classify whether or not serious bodily injury was caused by the accident.
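To give a flavour of what a StoryWall exercise involved, here is a minimal sketch of the Motor task, assuming a Keras/TensorFlow setup; the toy DataFrame and column names below are illustrative stand-ins for the French motor data, not the course materials themselves.

```python
# Sketch: a neural network for claim frequency with a Poisson loss.
# The DataFrame is synthetic; column names are hypothetical placeholders.
import numpy as np
import pandas as pd
import tensorflow as tf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "DrivAge": rng.integers(18, 90, 1000),
    "VehPower": rng.integers(4, 15, 1000),
    "Exposure": rng.uniform(0.1, 1.0, 1000),   # fraction of a year insured
    "ClaimNb": rng.poisson(0.1, 1000),         # number of claims observed
})

# Standardise the rating factors used as inputs.
X = df[["DrivAge", "VehPower"]].to_numpy(dtype="float32")
X = (X - X.mean(axis=0)) / X.std(axis=0)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="exponential"),  # positive claim rate
])
model.compile(optimizer="adam", loss="poisson")

# Target is the claim *frequency* (claims per unit exposure); each policy is
# weighted by its exposure so longer-exposed policies count for more.
freq = (df["ClaimNb"] / df["Exposure"]).to_numpy(dtype="float32")
model.fit(X, freq, sample_weight=df["Exposure"].to_numpy(), epochs=5, verbose=0)
```

Weighting each policy by its exposure is one common way to handle unequal policy durations; including a log-exposure offset in the model is another.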
Over the whole term, the students were given free rein to design their own project, which they would present to the class. This would be an end-to-end machine learning project, where the students set their own goal, found the required data, cleaned & massaged it, fitted a variety of deep and traditional models, and selected a winning model based on its performance on hold-out validation sets. Alongside the coding, they also analysed the ethical dimensions of their chosen AI task; even the earlier ‘harmless’ Taylor Swift example raises serious ethical questions about an artist’s ownership of their style & voice.[1]
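The model-selection step at the end of that pipeline is easy to make concrete. The sketch below, using scikit-learn on a synthetic dataset purely for illustration (not any student’s project), fits one traditional model and one small neural network, then crowns whichever has the lower error on a hold-out validation set.

```python
# Sketch: pick a winning model by its hold-out validation performance.
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

# Synthetic data standing in for a cleaned project dataset.
X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.25, random_state=0)

candidates = {
    "linear regression": LinearRegression(),
    "neural network": MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0),
}

# Fit every candidate on the training set, score it on the hold-out set.
scores = {}
for name, model in candidates.items():
    model.fit(X_train, y_train)
    scores[name] = mean_squared_error(y_val, model.predict(X_val))

best = min(scores, key=scores.get)   # lowest validation error wins
print(scores, "->", best)
```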
While I did specify a ‘baseline’ project – using the Human Mortality Database to project mortality rates for a chosen country – I encouraged the students to set their own task, and I was shocked by their creativity. One student (obviously a true Sydney local) chose to predict house prices across the suburbs of Sydney based on an aggregation of scraped real-estate data and government economic statistics. Another used Bureau of Meteorology (BoM) data to predict the weather in local suburbs. One of my favourites was a project looking at NASA radio telescope data and training an AI to find exoplanets (‘planets around other solar systems’, I had to Google this).
Exoplanet detection & Taylor Swift music may not seem particularly actuarial, so let me describe the technique behind the pop-music-generating AI. Given a collection of text data, you simply train a neural network to predict the most likely next word in a sentence, given the words that have come before; this is called a language model. Does this sound familiar? I imagine so, because the biggest difference between this fun generative project & ChatGPT is the ‘large’ in large language model.[2]
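For the curious, here is a minimal sketch of such a model at the character level, in the spirit of the student’s project (see [2]). The corpus, network size and training budget are placeholder choices kept tiny for readability; a usable lyric generator needs far more data and compute.

```python
# Sketch: a toy character-level language model – predict the next character
# from the previous ten, then generate text by repeated sampling.
import numpy as np
import tensorflow as tf

# Placeholder corpus (not the student's lyrics data) – any plain text works.
corpus = "the quick brown fox jumps over the lazy dog " * 20
chars = sorted(set(corpus))
char_to_id = {c: i for i, c in enumerate(chars)}

# Turn the text into (previous 10 characters -> next character) pairs.
seq_len = 10
X = np.array([[char_to_id[c] for c in corpus[i:i + seq_len]]
              for i in range(len(corpus) - seq_len)])
y = np.array([char_to_id[corpus[i + seq_len]]
              for i in range(len(corpus) - seq_len)])

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(len(chars), 16),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(len(chars), activation="softmax"),  # next-character distribution
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(X, y, epochs=20, verbose=0)

# Generate new text by repeatedly sampling from the predicted distribution.
text = corpus[:seq_len]
for _ in range(40):
    ids = np.array([[char_to_id[c] for c in text[-seq_len:]]])
    probs = model.predict(ids, verbose=0)[0]
    probs = probs / probs.sum()                      # guard against rounding
    text += chars[np.random.choice(len(chars), p=probs)]
print(text)
```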
AI, driven by deep learning, will have significant and unforeseeable impacts on the way we do actuarial data science, and now is a good time to be picking up some of these tools. I taught the students how to make a language model just for fun, but had no inkling that just a few months later OpenAI’s language model would become the fastest-growing consumer product in history & bring about a radical change in how we use computers. To even begin to tackle the difficult challenges of the limited interpretability of deep learning models & the quantification of uncertainty in their predictions, we need a firm grasp of their current capabilities. Luckily, AI is not really that hard to learn, and it is fun to boot! My open-source lecture slides for the course are available on GitHub.
References
[1] https://www.nytimes.com/2023/04/19/arts/music/ai-drake-the-weeknd-fake.html
[2] The student’s AI was actually generating lyrics one letter at a time, unlike most language models, which generate text word-by-word (roughly speaking). I find it remarkable that this crude technique can create anything even vaguely resembling English words!