Mark Stickells on AI & the role of Humanities in STEM

Make Me Data Literate

Taking a different tack with this episode, a wonderful chat with the remarkable Mark Stickells, AM, CEO of Pawsey Supercomputing Research Centre, on AI and the role of Humanities in STEM.

“I fundamentally believe knowledge is a human insight. And we work with these tools that are incredibly powerful, but can be incredibly stupid.”

“We need the same level of literacy and critical perspective in this world of AI because it is incredibly humanized and it is easy to access. It can present information in a way that’s compelling. It can for those that are vulnerable and perhaps haven’t the right critical support lead to some quite negative outcomes and it’s moving incredibly quickly.”

TRANSCRIPT

Linda: Welcome back to another episode of Make Me Data Literate. One of the absolute delights of doing this podcast is that I have interviewed so many and such varied guests, who are all amazing in their own ways. And we’re diverging from the usual topics and the usual questions today to interview the remarkable Mark Stickells, AM, CEO of the Pawsey Supercomputing Research Centre. Thanks for coming, Mark.

Mark: Thanks, Linda, that’s quite an introduction.

Linda: (laughing) – I could have gone with more, but we’ll leave it at that for now. I could spend the whole episode. So, first of all, who are you and what do you do?

Mark: Yeah, well, as you said in the introduction, I’m the CEO of the Pawsey Supercomputing Research Centre, which is a national facility led out of Perth, Western Australia. We’re part of the critical national research infrastructure supporting science and research activities across the country, with some particular areas of strength and focus here in WA, but with a national high performance computing resource that supports universities nationwide. We’re actually in our 25th year this year.

So we’ve been operating at sort of the leading edge of advanced computing for a quarter of a century. And it’s been my privilege to be the CEO, now entering my eighth year, which has flown by, with plenty of interesting developments in that time.

Linda: Pawsey is a remarkable place. It has the largest, I think, supercomputer in the Southern Hemisphere, Setonix, and one of the greenest supercomputers in the world. You have an Honours Degree in English Literature with a thesis about gender representation in James Joyce’s Ulysses, if I remember correctly. How did that lead you to be the CEO of a tier one supercomputing facility?

Mark: Well, I’ve reflected on this. There’s an expression that’s not mine. It’s from a philosopher of a couple of hundred years ago, Kierkegaard, who said, you know, life’s lived forward and understood backwards. I’m paraphrasing. So it is interesting to reflect on how I got here, but it did start with an arts degree, a humanities degree, at the University of Western Australia, back in the 80s, at that high fashion, high impact period of our life.

In that period, I was the first in my family to go to university, and the opportunity to enter into a tertiary degree was a profound one for me. I moved out of home to do so. And I think the pathway opened up to me through being exposed to education and research and universities, not in an academic sense, but almost in a business sense. I had a part-time job. I ended up completing my honours degree part-time because, out of financial necessity, I took a job working in the admissions centre of the Western Australian university system. And that became a pathway into an administrative role working in the research grants area.

So I had the opportunity to work either in the public sector or in education, which is the path a lot of people chose, but I found myself, through opportunity, working in the enterprise of universities, not just being a student, and they were fascinating organizations to join. In the mid-90s, at the start of the internet period, universities were on this thing called AARNet, the precursor to the internet here in Australia. So I was in a position where I was able to build web pages for the first time and help improve and provide better services to researchers and postgraduate students in the role that I had.

And over a number of career steps, I completed other studies. I’ve done a couple of masters degrees and some other graduate courses, but always part-time, because I’ve always worked. And on at least two occasions I’ve had opportunities to move into technical leadership roles where, for whatever reason in that environment, there was a change of partners in the organization, or there was a particular milestone in the growth of that organization. The decision-makers who appointed me saw something in my ability to bridge different technical domains, to be, I think, an effective translator, to understand strategy and stakeholders, and perhaps, at the heart of it all, to bring people together and to help build things.

So I think I’ve been involved in CRCs, Cooperative Research Centres, in university institutes, and in joint ventures with industry around energy and resources, which were particularly strong in the early 2000s in WA. And so when it came to, what, eight years ago, the opportunity to join Pawsey, I did not know… I don’t know how to code… advanced coding. My last coding, I think I shared earlier, was in HTML 1 in 1995, to build one of the first web pages. That’s not a particularly impressive coding resume, but my resume has been built over almost 30 years now of working in research, industry, community, and government partnerships with our best and brightest, helping to focus on how we can bring the sort of talent that exists in our universities and research organizations to tackle some of the most challenging issues of our time.

And I’ve had the privilege of working across environment and agriculture, energy and resources. And now this area of supercomputing, it’s an amazing area to work in. I joke at times that we’re like the digital Swiss army knife for science: we work with really cool things that enable us to simulate, analyze, bring to life, bring to an experimental sort of capacity, things that we couldn’t do in a lab, or couldn’t do in the timeframe, or with the right sort of modeling behavior, and to manage masses of data in a way that enables scientific breakthroughs that I think can benefit communities, that help our healthcare, help us understand our environment, or add to the stock of knowledge about our universe. Which is one of the foundational things that we do here in terms of supporting world-class radio astronomy in Australia. So, long answer.

And I guess I reflect, I talk a lot about having literacy. So I know this is a podcast, but I’m holding my hand vertically: literacy going across, and expertise going down. I translate. They talk about T-shaped sort of people; I’m a T. I have some breadth and a little bit of literacy, which helps me in a world of systems for managing knowledge and research, and your superpower, Linda, data. It needs tools to help unlock insights and create new knowledge and, I think, benefit the communities that we serve.

Linda: That’s awesome. And we’ve had a couple of Pawsey employees on the podcast in recent times, and it seems to be a theme that Pawsey employs people who can do that translation piece, and that communication. They may not have the precise skills for the job that they’re going into, but as regular listeners will have heard me say before, the technology is actually quite easy. It’s the people that are the hard part, and that communication piece is really important. What skills did you take from your arts degree that are relevant to technology in general and data science in particular? And how do you use those skills in your job?

Mark: I think I have critical thinking and critical analysis skills that are fundamentally important. At times I say it’s often more important to ask the right question than to know the answer. I’m working in an environment where people have all the answers, are incredibly expert, but it’s really important to ask the right questions. So the “how” we do something needs to be understood in the context of “why” we do something, and that doesn’t happen enough. And I think an arts degree, for me, was the starting point of a career in learning, and of continuing to want to learn and to respect different perspectives in order to understand. As I said earlier, we understand by looking back, but we have to live life forward. I’ve been lucky to live long enough to see a few waves of difficult times, and in some aspects of what we’re going through now, globally, these are difficult times, and what we share of those insights hopefully can help inform better decisions as we go forward.

And I think that’s intrinsic in an arts kind of education, a humanities education. And so I absolutely respect the technology, and have a degree of literacy, and my colleagues might argue not entire competency, but there’s a different emphasis. And there are ongoing jokes about not letting me near the data center, the data hall, and actually bumping into things and making them not work. But that’s another story. But there’s the ability to ask the questions and to be critical, if you like, to understand the frameworks for knowledge creation and knowledge sharing.

Gosh, in the world that we’re dealing with now, we think that a bit of silicon is going to be insightful, because we have billions of those little transistors, those little electrical circuits, generating knowledge. I fundamentally believe knowledge is a human insight. And we work with these tools that are incredibly powerful, but can be incredibly stupid. I mean, they can be incredibly… again, I’m choosing my words here, because if I say insightful, I’m attributing the technology with a human quality. And I think we do that a little bit too freely. So I often reflect that we work with some of the most powerful bits and bytes in the country, but the hearts and minds are important. And that’s not just a stakeholder management reflection, that’s an important principle for what we do here, with the privilege we have to work with amazing people and amazing technology and tackle some of the pressing issues of our time. The ethical and sustainable use of that privilege raises questions that aren’t just answered by a program in a computer.

So if you come to Pawsey, again, I’m waving my hands around, we have a decommissioned GPU that I wave around and point to and say, look, this is what’s powering the systems. And we have hundreds of these, and they’re enabling wonderful, potentially wonderful tools that might help improve the availability of information. But the decision making and the sources of that information need to be understood, and they’re not always properly attributed or properly curated. And then the applications are, in some areas, questionable.

And so one of the things that researchers have, or should have, is a rigor about their method, about their benchmarking, about their analysis, and about the way they present their data and the insights and the recommendations that they offer. And that needs human steps and human checks and balances at many, many levels. Which is your world, Linda.

Linda: Well, that’s it. And in a recent interview for this podcast with Kat Ross, she made the same point, that –

Mark: Superstar Kat

Linda: People are so generous with their time, it’s such a joy! … She made the point that if she just kind of farms her data off to a system, it can only look for things that it knows how to look for, and look for things that it’s looked for before. Whereas if she sits down and actually goes through the data herself, she’ll find things that the system can’t see because it’s not set to look for that, because nobody knew it was there. And that human element is really important, that level of insight and critical thinking, as you said, is something that we forget, in our rush to adopt AI and have it do all of the things. There’s things that it just can’t do.

Mark: I don’t think so. I might sound a little, I can’t think of a better word, Pollyanna-ish about critical thinking. Critical thinking can be malevolent too. (laughing) But given the privilege of working in the space that I’ve worked in for my career, I’m disposed to it being for the greater good: people working their careers in universities and in research organizations, powered, I believe, by curiosity, powered by belief in the transformative power of education. In our system of society, these educational institutions and research organizations are important pillars. They provide a really important pillar in a community, and they have done for a couple of thousand years. It’s not that this is a new thing, but we’re having that tested by believing it can just be replaced by some code and some hardware, and the generation of insights from the research community now starts from what are largely becoming these very large models that, even in the way they talk, and – they talk! – I’m trapped every time. As soon as I start talking about this, you get into this anthropomorphism, I can’t say the word, cut this out. Anthropomorphism, help me out here, Linda. –

Linda: Yeah, you anthropomorphise them –

Mark: Yeah, you do it, it’s almost a little insidious that we give it human qualities when we should be doing that very carefully or with proper consideration. I’ll practice the word again later, but. (laughing) –

Linda: Humanize is your safer option, I think. (laughing)

It’s interesting to talk about this idea that we can replace people with large language models or the rather misleading name of AIs. What concerns do you have about the chatbots and AIs in general? And it’s hard to use the term AI because there are many things under that umbrella and there are great systems developed at Pawsey using AI or machine learning to do amazing things – it saves lives and it discovers all kinds of stuff. But there are issues, what are your concerns?

Mark: Well, I’ll bounce it back to you, with an unsolicited advertisement for ADSEI here, Linda. One of your missions is to make the world more literate in the world of data, you know, Make Me Data Literate. This is it: we have insufficient literacy in this space. And you’re speaking to a humanities person who did a bit of literature; I mean literacy in this space, literacy in the world of technology, is really important, and in the world of data. In some respects, you’ve made great presentations and commentary on what happens when data is misrepresented, when we don’t understand the scales and the charts and the method used to produce the data and where it was sourced, all of those things.

We need the same level of literacy and critical perspective in this world of AI because it is incredibly humanized and it is easy to access. It can present information in a way that’s compelling. It can for those that are vulnerable and perhaps haven’t the right critical support lead to some quite negative outcomes and it’s moving incredibly quickly.

So, the literacy aspect that you apply to data, it’s the same for me with AI, what do we understand? I used the expression in a presentation I gave to a community. It was… my parents are in a retirement community and they asked me to present a couple of months ago on AI. So I went in there one evening and it was 50 or 60 people. I think the average age would have been well into their 70s and they had some great questions for people that are looking at this, wondering what it means to them.

And I said, if I said to you, I was going to calculate something, and here’s the result; or I said to you, I’m going to infer something, and this is what I’ve inferred, what would you do? And a couple of people said, well, if you’re calculating with numbers, I know that’s a result. But if you’re inferring, you’re… and so a great conversation started like that. And then I said, well, a lot of computing is based on calculations. It’s known numbers to give you a known result. It’s a precise formula that can give us a result, and there’s a whole body of information and data that can give you that calculated result. But this current wave of AI, and we’ve had AI for decades in narrow senses, but in this very broad application now, it does a lot of inferencing. It tries to match things, either through patterns or statistics. And so it will give you a result based on inferring. And people were looking at me going, oh, and I said, well, yeah.
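[Editor’s note: the calculate-versus-infer distinction Mark describes can be sketched in a few lines of Python. This is a hypothetical illustration, not from the conversation; the function names and the straight-line fit are the editor’s choices.]

```python
# A calculation applies a known formula to known numbers: same input,
# same exact answer, every time. An inference estimates an answer from
# patterns in data: plausible, but not guaranteed.
from math import pi

def calculate_circle_area(radius: float) -> float:
    """Calculation: a precise formula gives one exact, repeatable result."""
    return pi * radius ** 2

def infer_next_value(history: list[float]) -> float:
    """Inference: fit a straight line to past values and extrapolate one
    step ahead. The answer is a best guess, not a fact."""
    n = len(history)
    xs = list(range(n))
    x_mean = sum(xs) / n
    y_mean = sum(history) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, history))
    slope /= sum((x - x_mean) ** 2 for x in xs)
    intercept = y_mean - slope * x_mean
    return slope * n + intercept  # predicted value at the next time step

print(calculate_circle_area(2.0))         # always the same exact answer
print(infer_next_value([1.0, 2.0, 3.0]))  # a projection, not a certainty
```

A language model’s output is closer to the second function: it matches patterns in what it has seen before, so its answers carry the confidence of a prediction, not the certainty of a formula.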

But in their world, the way it’s presented to them on a screen, it’s got this human element, it’s got this inferred authority that is, in my mind, overstated. And we haven’t equipped different user communities to operate with these tools. It was a lovely audience, and lovely to talk to them. The other point that I wanted to make, which gets to data, was that a couple of the guests in the community hall asked me about using applications on their phones: was it safe to do this? This application looks really good, and it’s free, but should I pay for this one? And my bit of advice to them was that there’s no such thing as a free application: every time you’re using an application on the internet that is free, it is collecting some data about you, and you’re paying for it with your data. It’s your location, some sort of demographic profiling of you, or something about the patterns of your use that is of value, not necessarily to the product that you’re interacting with, but potentially to some other actor. So be very sensitive to that.

And so they were my two little takeaways. Think about your own data and have some understanding that it is valuable, and that when you are on these applications, it is of value to others. And there’s a difference between, if you like, a calculator and inferring things, so be critical about the information that you source from tools like the language models. At the same time, I’ve shared that people are using AI to help them find the best path through traffic, or to predict certain things in their local environment; there are very valuable things that we use. But the idea that it’s a universal panacea for all things, some of the hype that we see in the press, is something I’m very cautious about replicating. I think care is needed. There’s debate around sovereign investment in this area, and I think there’s a need for national investment in the capability and the infrastructure and the data security in this space. Just as we want to build our roads, ports and rail infrastructure, this is as important as those.

Linda: Yeah.

Mark: I’ve just got side tracked on another topic as well. Sorry. (laughing)

Linda: You’d be amazed how often that happens. And that raises a couple of really interesting questions, because I note in your discussion about talking at the retirement village, they were saying, “Oh yeah, we’re really confident about results that are calculated, as opposed to inferred.” And that’s problematic in itself: when we’re presented with results, when we’re presented with numbers or graphs, we do tend to believe them a little too uncritically, which is part of my work.

But also the comment you made that if the system you’re using is free, then you’re paying for it with your data: unfortunately, I think that’s universally the case. It doesn’t matter whether it’s free or not. If it’s free, you’re definitely paying for it with your data, but more and more, even if you’re paying for the system, you’re paying for it with your data, because companies are realizing that they have multiple revenue streams here. They can sell the product to you, but then they can sell your information to other companies. So the lines are getting blurred in both cases. The hard result that’s calculated maybe isn’t as hard as we’d like it to be in all cases, and understanding the edges of these things is really tricky. Do you think that some of this is a result of, or at least exacerbated by, this habit we’ve got into of keeping STEM in one box and humanities in another? Now, I’m gesturing.

Mark: It definitely doesn’t help. I read a piece in a national paper a month or two ago that said that soft skills were in demand, which was a way of saying that the AI boom needed critical thinking and a different set of skills to manage what was going through the sector. And I think the separation doesn’t help. I think it actually helps to have humanists and historians and philosophers also asking the questions in these situations. It’s not just about what the technology can do and how it can work and how we best apply it.

Being inclusive of different views is incredibly important. And to me, it’s often at the intersection of perspectives that the real breakthroughs happen. So if it’s technology driven, technology led, we’re definitely missing an important insight. I tend not to like the polarizing of STEM versus humanities, or hard versus soft. And I’m not sure I can properly reconcile putting the A in the middle of STEM either. I like the idea that there’s an inclusion of perspectives. I often give a presentation and someone will say, yes, I’m trained in this, but what I really do, out of interest or enthusiasm, is this other area. And it’s quite different, and they bring in different perspectives. I come back to my earlier point about asking good questions. And I do think, perhaps by default, the liberal arts are driven by questions in a different way to, perhaps, the STEM-powered subjects, but they’re still based on hypothesis and experimentation and principles. So I find humanities colleagues need to do a better job of selling their credentials. And I don’t want to be critical of our STEM colleagues, who’ve done a good job of saying these things are really important. So perhaps the point I’ll land on is that colleagues who have a background like mine, and they do pop up around the place, need to do a better job of advocating for what they can contribute in this world of science and technology. That’s incredibly important.

Linda: It’s like medicine, right? We have these deeply expert specialists in cardiology and immunology, and we treat them as separate and distinct systems, but they’re not. They’re in no way separate and distinct. They interact and they affect each other. And so we have to treat the body as a whole, not just as a heart or a circulatory system on its own. That doesn’t work. Speaking from experience, it’s a very complex system, and it can go wrong in all kinds of ways that are a little bit from one system, a little bit from another system, and something thrown in from over there. And if we try to make them really distinct silos, it makes it really difficult to manage the whole body.

And the idea that we can keep STEM and humanities separate and that they are entirely distinct disciplines, I think is equally flawed.

Mark: Yeah, I’m glad you didn’t get talking about needles. My aversion to… (laughing) – A longstanding joke, listeners: I’m not very brave when it comes to medical procedures and needles. But the point’s really important. If anything, this automation of knowledge is requiring our disciplines to come together even more. There’s specialist insight, but there’s a need to bring things together at the intersection, and it does come down to people. The technology can do whatever the technology is pointed at, but if we want it to have community benefit, to be sustainable, to help those at a disadvantage, you often ask different questions to those of a purely technical view.

In many respects, my colleagues tell me that the technical bit is kind of the easier bit; dealing with the benefits and the risks and the communities that we both serve and impact, that’s harder than coding in ones and zeros. The computing getting bigger and bigger and the data getting bigger is opening up more services. But the ethics, and the environmental impact, and the disadvantage for some who don’t have access to the benefits of these advances, they’re important questions. And I think that’s where humanities researchers and colleagues across that array of disciplines can make an important contribution to the debates of the day.

Linda: There’s a joke in the tech industry that every app becomes a stalker app within six months. And it’s not as untrue as I’d like it to be. And often I think the reason for that is a lack of diversity in the room: it was designed by a bunch of guys who were really excited about the technology and who just didn’t think about the fact that, oh yeah, what if we add sharing your location? That’d be so cool. And then your friends can see where you are. That kind of stuff happens. And you know, we’ve also seen the issue in the US of people using apps to monitor their menstrual cycles, and with the whole attitude to abortion over there, that’s now actually a terribly dangerous thing to do. Because no one’s thought through the implications. There was no one in the room who went, “Wait a minute, is this actually a good idea?” And I think that comes back, as you say, to asking those questions. Well, hang on, what could go wrong if we do that? This is fun and it’s interesting and it’s exciting new technology, but are there downsides? What are they? There are always downsides.

Mark: Technology in itself is neither ethical nor humane. It’s agnostic to these things. The human application and the human interaction and, you know, the human activity driven by the technology, that is the issue. So if the human questions aren’t being asked and addressed, the technology is agnostic to these things. I’ve tried to, I won’t say the word again, so I’ll say humanized rather than the A word. (laughing) But we’re finding ourselves… it’s becoming harder and harder to make that distinction. And I think it’s really important to have different voices in the room to make that distinction.

Linda: Yeah, yeah. We see from the public response to things like COVID and climate change that science isn’t always as effective as we’d like it to be at shifting public perception and behavior. Is there a place for humanities in these conversations?

Mark: Absolutely, which probably doesn’t surprise you to hear me say. (laughing) Various technology platforms and the behavior of some actors have eroded trust in systems that have served us well for hundreds, if not several hundred, years. So it is concerning. There was a great report released by the Academy of Science last week at the Shine Dome at the ANU, pointing to various disciplines and to gaps in the pipeline of skilled people coming through our national university system, and projecting out to 2035 what that means. It’s a useful data-driven framework to help inform decisions across the sector, which are important for workforce planning and those sorts of investments.

I had the privilege of being there for the day, and one of the most impactful presentations was by the president of the Australian Academy of Humanities, who presented the evolution of the university sector: where we came from, post-World War II to now, and the various major evolutions and revolutions in our sector. It both presented insight into the challenges that we’re facing today and offered a bit of hope that we can actually continue to evolve. We’re incredibly resilient in this space, arguably.

Right now it feels like the world’s changing very quickly around us. But if we take a step back and perhaps look back to look forward, we can see the seeds of resilience, the seeds of opportunity, the seeds of insight from what we’ve learnt, and perhaps apply some of that insight to the difficult decisions we have to make going forward. And I’ve ignored, not intentionally, the important place of Indigenous knowledge and Indigenous cultures in our Australian context. That’s a privilege that we have: hundreds of communities that have been connected to this country for tens of thousands of years and have different insights into knowledge and storytelling and the power of stories, which I think is a theme that you often have in your work with data: data without a story is musical notes without a tune, or whatever the expression might be. I am tone deaf and can’t play musical instruments, so you could put notes in front of me and no one would hear a tune, but I can perhaps string a few words together and tell a story. So there’s definitely an important role for our humanities and for alternative perspectives. And I’ve also shared that, even a relatively short time back in terms of the arc of knowledge, and I think of First Nations people here too, Socrates was saying that if we started to write things down we’d become less intelligent, because the oral tradition of knowledge sharing was his, the Socratic method. We thought that the printing press, that having books, was going to make us less intelligent, and we’re having the same conversations about what’s happening with social media. The internet was meant to democratize knowledge and make it incredibly available in its time, in the 90s.
And I remember early in my career thinking what an amazing opportunity it was to not have to go into the closed reserve of a library, take a book out, and record it for myself; within a year or two, I was actually accessing this from anywhere from my computer. That was a wave, and I think we’re seeing another wave now. And I can’t help but remain fundamentally optimistic. I’ve worked for 30-plus years now in the sector, which supports education, which supports curious, enquiring, committed people to work in research for varying reasons. Some are very motivated to solve a particular problem that’s going to save lives; another group might be interested in understanding the place of this planet and its evolution, and that’s what motivates them every day; or understanding how to produce sustainable energy, to secure communities; or capturing the important aspects of our history and not losing them, so that future generations understand both the lessons and the insights of the past. So I’m in the fundamentally optimistic sort of category. We’re in a wave of technologically driven change at the moment, and there are some challenges to systems of government and our geopolitics, but I honestly get up every day, and this might sound a little bit kitsch, but I do believe that we do important things. The people that I’m privileged to work with at Pawsey, and through them the thousands of researchers who work every day to try and understand our world or to contribute the tools and solutions and insights that help make it better, are what I really enjoy. It’s a privilege to do that.

I had another little line: I don't worry about stock prices, I worry about the stock of human knowledge. That's one I haven't put out there before, but the point is that that stock is important. And it's that insight that is being tested: many institutions have lost the trust of the community, in the way we engage with the media, the way we engage with government, the way the technology platforms can present information as if it's authoritative, and we can be deceived by that. We don't always have the inherent confidence or training to be critical, which comes back to the point that it's not just STEM, it's not just humanities; it's all sorts of literacy that we need to be concerned about. And that's what motivates me as well.

Linda: That's a really nice ad for the whole idea of critical thinking, and a beautiful place to end: we need that literacy and we need that thinking, but there's also amazing, amazing work happening. And it's all about where you direct your attention, isn't it?

Mark: Yeah. And this is a call to action for our university colleagues and those that work in research agencies across the country: your work is valued, even if the platforms to promote and respect it aren't always well aligned. I think we can all do a better job of celebrating our committed research community and the people they touch and support through their own discipline areas. So I'm very agnostic to the disciplines. The work that Pawsey does supports the arts right through to the most advanced astronomy. And we've done some cool things in quantum computing as well, which is probably the subject of a whole other episode. What does that mean?

Linda: Yeah, that's the last episode I put out, I think, with Pascal.

Mark: Pascal is a rock star as well. So I love the fact that we get to work with so many very, very cool people who, a bit like me, haven't had a straight-line career path into the world of HPC, but one that has taken some left and right turns and landed them in places where you get to work at the intersection of leading technology, leading science, and very curious and committed experts. It's a very cool place to be. And several of my colleagues have done some wonderful things to help create an environment where we can continue to attract people to work in this space. That is now, more than ever, an important pursuit to support.

Linda: Thank you so much. This has been a wonderful conversation, a little different to the usual Make Me Data Literate, with some variation in the questions, but I think we've explored a really important space. I'm actually going to follow it up, at some point once it all gets scheduled, with some amazing folks in the world of qualitative data, because I realized I've only covered quantitative data, and quantitative data can tell you the what, but it can't tell you the why. I think that's the place of qualitative data, but also the place of the humanities: to work on the why, and then the "should we?"

Mark: Well, as a little follow-up, if you want to go to the Academy of Science website, there is a stream of the talk I mentioned. You can see talks from the President of the Academy of Science and the President of the Academy of Humanities, and some interesting panel discussions about the state of science, technology and workforce, and the opportunities ahead. That's on the Academy website, and I think it's available to anyone. And for those who might be interested in the history of the Australian university research system, the Humanities Academy president's talk, a 20 to 25 minute talk, was great. I'd really recommend it; it's worth listening to for how we got to now.

Linda: Fantastic. Thank you so much for your time.

Mark: Thank you.
