In an amazing chat that ranged over the power of data storytelling, the impact of scientific fraud, and how to stay positive in the face of environmental catastrophe, Professor Euan Ritchie proves himself a remarkable science communicator, an ecologist, and a seriously lovely human being.
“Data is power and knowledge, and if we have good data it gives us opportunities to make better decisions, and make choices between different decisions, about which decision might be the best, and the most effective, and the most efficient.”
“Being taught by Professor Terry Hughes, a coral expert, and Professor Stephen Williams, an expert in the mammals, birds and reptiles of the wet tropics of north Queensland – they were both warning me in the mid 90s and early 2000s about what the possible impacts of climate change would be, whether coral bleaching, or animals having to move up mountains as it got hotter because they’d run out of places to live. And as worrying as I thought that was at the time, I also partly thought maybe they were being a little bit pessimistic and it won’t be that bad. But what they said has come to pass, and in fact in some cases far quicker and more severely than we would have hoped. So it is really upsetting, but we can’t give up.”
Transcript
Linda: Hello and thanks so much for joining us for another episode of Make Me Data Literate. We’ve had two economics episodes recently, so this is a big change of pace, and I’m really excited to start talking about some conservation ecology. To do that we have Euan Ritchie. Welcome, thanks so much for coming.
Euan: Thanks for having me.
Linda: I’m very excited. So to start off, can you tell us who you are and what you do?
Euan: Sure. I’m a professor in wildlife ecology and conservation at Deakin University, and the what-do-I-do part is really trying to understand how species interact with each other. So you can think about, as an example, predators and prey, and how that’s really important in influencing what happens in ecosystems and the environment, and by having better information about those interactions we can make better decisions about how to manage and conserve populations. So as an example you might think of red foxes or feral cats, how they interact with native wildlife like bandicoots or bilbies or some other native marsupial, and the things that we do in terms of fire management and other forms of management. If we can have a better understanding of how those interactions take place, we also have better knowledge – which comes back to data – about what we might do to have good outcomes for our native biodiversity.
Linda: I love that idea – that if you actually measure things you can understand them, and then use that to feed into the way you manage them and the decisions that you make. I feel like there’s not enough of that in the way we govern, and I suspect, based on your Twitter feed, that you feel that way too.

Euan: I think I’m in raging agreement with you. It’s a commonly said thing that data is power and knowledge, and if we have data – and good data, and I’m sure we’ll get to that, because there’s data and then there’s data – but if we have good data it gives us opportunities to make better decisions, and make choices between different decisions, about which decision might be the best, and the most effective, and the most efficient. So everybody loves data, or access to good data, because it gives you that power.
Linda: Yeah, that’s right, and also learning to identify, as you say, the good data and the bad data, and to question the representations that you’re shown, and things like that.
Euan: Absolutely.
Linda: I guess we’ll come to more of that later. What did you have to learn to do what you do, particularly about data? Was there anything missing from your formal education?
Euan: Yeah, look, I did a decent amount of statistics in my second year of university – a whole year in fact – and that’s not necessarily always the case now for students undertaking similar degrees. I think it’s fair to say that very few ecologists, environmental scientists or conservation biologists are prepared for how much statistics is involved in these fields, and how sophisticated those statistics are. There’s a running joke in ecology that ecology is not rocket science – it’s much harder than that – and that’s because there are all these moving parts. There’s a lot of noise, and we often talk about noise in data sets. We’re dealing with really complex, challenging issues, and we therefore need really good, robust data sets, and also in many cases highly sophisticated statistical approaches, to make sense of what that data may or may not be telling us.
So yes, I was given statistical training, but probably not to the degree that I needed – but very few are. And there’s another important thing, I think, in terms of environmental science and ecology and conservation in general: some people have a natural affinity with data, and with mathematics and things like that, and I wish I was one of those people. I’m just not, and it’s something that I’ve had to work at. But there are obviously other really important skills as well, in terms of just being observant – having your eyes open and noticing things in nature and questioning them.
Taking notes, and, as is the scientific process, creating hypotheses – going, oh, maybe it’s this or maybe it’s that – and then undertaking an experiment or an observational study, collecting data, and then testing those hypotheses. Above all, it really is about questioning things. Really questioning things. I think that’s what I love about having a science degree: regardless of whether it’s the environment or anything else in my life, I question it all the time and go – is that really true? Where’s the data and evidence to support that? So I think that’s a really important lesson.
Linda: I love that, that’s key. It’s not always the way we teach science, particularly in schools. One of my standard drums that I beat is that we tend to teach science as a matter of facts and processes rather than as a discipline of questions, and I think it’s incredibly important that we do ask more questions – but that we ask rational, evidence-based questions rather than YouTube-rabbit-hole kind of questions.
Euan: Absolutely, and it’s challenging, because as we collect more data sometimes of course we challenge long-held theory and so forth. But that’s a healthy part of the scientific process, right? We go, actually that’s not true anymore, or the evidence no longer supports that, and you collect more data, and that’s a healthy way to operate.
But I understand that’s challenging for the public, because they’ve been told a certain thing operates a certain way, and some people might accept that, and then we as scientists come along and go, “actually, we might have been wrong, it might be like this”. But if we don’t do that, and we just accept things at face value all the time, that’s far worse.
Linda: Oh, absolutely. And that idea that science is working when it comes back and goes, actually, the evidence we have now is not covered by that theory, we’ve just disproved it, so we’ve got to move on and figure out something new – that’s science at its best. And it’s absolutely held up by the less rational skeptics as, “see, science is wrong! And you’re wrong! It’s not working!” Like, no, no, this is what it’s supposed to do! I was interested to hear you say before that you felt some people were naturals at mathematics and that you’re not one of those people, and because education is my thing, I like to think that it’s less a matter of innate ability and more a matter of how you’ve been introduced to it.
Euan: Oh, I completely agree. I think it’s how you’re taught – and nothing against my teachers in the past. The vast majority of my teachers were wonderful, and I think teachers in general are just the most amazing people and they all deserve gold medals. I just think that in the past the way I was taught mathematics was probably very simplistic: there was one right way or wrong way, as an example, to solve an equation.
We now know that’s not the case anymore. We now know, even in primary school and even with my children, that different kids are taught to solve things different ways because that’s how their brains work, and that just wasn’t my experience. So I absolutely agree with you that everyone has the ability to connect with data and mathematics, and the wonder of that too – unfortunately I probably just didn’t get that opportunity. And this idea does float around in society, that some people are good at maths and some people are not, and as you rightly point out it’s probably more about the opportunities you’ve had to really get help if you need to, or to approach things from a different perspective. I’m really glad that that’s actually happening now in schooling, much more so than when I was young.
Linda: Yeah, I’ve seen kids come out of primary school saying that they’re no good at maths because they were low down on the times-table challenge – which means they weren’t good at memorization. That’s not maths.
Euan: Exactly.
Linda: Yeah, and thinking deeply about things and coming up with innovative ways to solve problems is not typically tested in school maths, and those are the skills that we need. Those are the mathematicians that we want, really.
Euan: Absolutely.
Linda: I promise I didn’t set Euan up with these talking points, but he’s hitting all of my favorite “yes!” moments. Is there one thing you wish everybody knew about data – one thing that would just change everything if everybody understood it?
Euan: Ah, the short answer is no, only because data is only one part of the equation – excuse the pun. We know with climate change, as an example, that there’s been plenty of data, so much data that says this is what’s going on, this is what’s very likely to happen if we don’t do this – and then of course it does happen, and often far worse than the scientists said it would, because scientists by their very nature are generally pretty conservative. But things aren’t changing to the degree that we would like, and that’s a communication issue as much as a data issue. So I think we absolutely need really robust data that we can have confidence in, and again, to infer from that what we think is happening in our world, whatever the issue might be, and then make good decisions and take actions that come from that.
But the other, equally important part of that equation is communicating it, and getting people to actually act on what that data is suggesting. There’s a whole bunch of data that I would love people to understand better – about consumption, as an example. Population size I think is a really interesting one in this context. I often see debates online about the environment where people will quickly say, oh, it’s because there are too many people in the world, and I recoil a little bit from that, because yes, there are a lot of people on the planet, but there’s a tiny minority that is disproportionately using all the resources, and a large number that is not. So there’s an equity issue there, and really what’s going on is a massive consumption problem. Now that’s not to say that if we keep increasing our numbers there won’t be problems – of course there will – but that’s an example, I think, of where one piece of data, just focusing on population size, can be quite misleading, and arguably has really problematic cultural connotations about who gets blamed for what.
Linda: Yeah, it’s about the storytelling, isn’t it? It’s about how we get this across and convince people of the need for action. And we actually know that showing people data that contradicts their position embeds them further in their position – it doesn’t persuade them.
Euan: That’s right, they dig in.
Linda: Which is that quirk of psychology where you go, wait, I don’t want it to work that way! But yeah, it’s a skill that I think needs to be part of every science degree – probably part of every degree, but that’s a different argument, I guess. What are some of the worst data mistakes you’ve seen?
Euan: Yeah, there’s a whole range of different data mistakes. I think one of the most basic ones, and I’m sure you’ve seen this, is how data is actually presented – so, as an example, graphs, one of my pet hates. You’ll see pie charts that don’t sum correctly: they’ll have the numbers inside the wedges, and if you just do the simple math on them you realize they don’t equal 100%. Or axes on graphs that distort the pattern: if you’ve got two graphs with different axes but they’re shown as if they’re the same, and then you look at the pattern, of course one looks far more convincing than the other – that’s a trick of the eye, and so that’s a real problem.
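The pie-chart check Euan describes takes only a couple of lines to automate. A minimal sketch in Python, with made-up wedge labels and percentages purely for illustration:

```python
# Hypothetical wedge labels and percentages as printed inside a pie chart.
wedges = {
    "habitat loss": 45,
    "invasive species": 30,
    "climate change": 20,
    "other": 12,
}

# The simple math Euan mentions: do the printed numbers add up to a whole pie?
total = sum(wedges.values())
print(total)  # 107: the wedges claim more than the whole pie
```

A real chart’s wedges should sum to 100 (give or take rounding), so a total like 107 means the figure, or the numbers printed on it, can’t be trusted.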
There’s also the issue of what are often referred to as outliers, and I remember even in my undergraduate training being taught about outliers in data sets. Now, there are genuine statistical outliers that could be due to error – data collection error – in which case they need to be removed from the data set, because otherwise, if you analyze them, they will give you a spurious finding. But there are also outliers that are legitimate. You might have a population of animals, and one animal might be way taller, or way bigger, or do something incredibly different compared to the rest of the population.
And if you just throw that data point out because it affects your statistical relationship – because of course it skews it in one direction, being this big outlier – that’s actually not appropriate. That is a real data point, and it’s telling you something about that population. Now, there are ways of dealing with that, such as presenting the data both ways, one with the outlier removed and one with it included, and talking about what influence it’s having on the relationship, and so forth. But there was this idea, probably more so in the past, of deliberately removing outliers because they were inconvenient for statistical tests, and that’s a big no-no.
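Euan’s point about a single legitimate outlier dragging a statistical relationship around can be seen with a tiny ordinary-least-squares slope calculation. This is only a sketch; the measurements below are invented for illustration:

```python
def slope(xs, ys):
    # ordinary-least-squares slope: covariance(x, y) / variance(x)
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

# invented measurements, roughly linear (think body size vs. mass)
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 8.0, 9.9]

# the same data plus one genuinely huge individual
xs_big = xs + [6]
ys_big = ys + [30.0]

print(round(slope(xs, ys), 2))          # 1.97 without the outlier
print(round(slope(xs_big, ys_big), 2))  # 4.55 with it: one point more than doubles the slope
```

Which slope is “right” depends on whether the big individual is a recording error or a real animal – which is exactly why the decision to keep or drop it has to be made, and reported, explicitly.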
Linda: Is that what you were taught to do at uni – to remove your outliers?
Euan: I wouldn’t say I was taught to do it, but it was definitely something that I observed, and I think it was quite common, yeah. There were even statistical tests to determine whether a point was an outlier or not, and based on that threshold you threw it out or not. And I think we’ve now come to realise – maybe also with the development of better, more sophisticated statistical methods that deal with outliers better, and deal with non-linear relationships as well, which is another big change in general – that we’ve moved away from that idea of removing outliers unless they’re genuinely erroneous. And then another really important issue, particularly in my field of science, is observer bias and error. As an example, say you were surveying a population of possums by spotlighting – you have a torch and you walk along the road, and a person counts how many possums they see each night. If the same person does that each night, that’s fine, because they have the same ability each night. If another person does it, they might be much better or much worse, and if you don’t factor in that observer bias, again, that can influence your data set, particularly if you’re collecting at different times of year, in different locations, and so forth. So thinking about how data is collected, how it’s presented and how it’s analysed is absolutely fundamental to the areas I’m in – environmental science, ecology and conservation.
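The spotlighting example can be simulated to show how much of a raw count reflects the observer rather than the possums. A hedged sketch – the population size and both detection probabilities are invented:

```python
import random

random.seed(42)  # fixed seed so the illustration is repeatable

TRUE_POSSUMS = 40  # hypothetical true number along the transect each night

def nightly_count(detection_prob):
    # each possum is spotted independently with the observer's detection probability
    return sum(random.random() < detection_prob for _ in range(TRUE_POSSUMS))

# invented observers: A is experienced (spots ~80%), B is new (~40%)
counts_a = [nightly_count(0.8) for _ in range(200)]
counts_b = [nightly_count(0.4) for _ in range(200)]

mean_a = sum(counts_a) / len(counts_a)
mean_b = sum(counts_b) / len(counts_b)
print(mean_a, mean_b)  # very different average counts from an unchanged population
```

If observer B took over halfway through a survey, the raw counts would suggest a population crash that never happened – that’s the bias Euan says has to be factored in.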
And also because in many cases we don’t have many data points. If you’re working on critically endangered species, or threatened species, or some particular facet of ecology, we often don’t have hundreds or thousands of observations. We might only have 10 observations, or 100 observations, and for some statistical tests that’s really up against it. So each data point matters, and we do agonise over that as well.
Linda: I think one of the fundamentals of teaching data science properly is that idea of asking, first of all, what is wrong with my data? What do I know and what do I not know about this data set, and therefore how many possible solutions are there to this analysis that I want to do?
How many reasons might there be for what I’m seeing in the data? Instead of coming out with what we like to do a lot in schools and universities – the one perfect answer, like, “here is the answer from this data set”. That’s not how real data works. Teaching that idea that there is uncertainty, and that we actually need to spell out the uncertainty, list the issues, and really drill down into how many ways I might be wrong about what I think this data is telling me – it’s a really challenging skill. It’s confronting!
Euan: It is, and it gets back to communication. A lot of the work that our group does is correlative – it’s observational. That’s not to say those approaches aren’t useful; they absolutely are, and in some cases they’re all we have. But you therefore have to be really careful about the language you use. You talk about associations that might suggest this, but you can’t come out and say “this data shows this” – we can’t say that. But you see that all the time, particularly in the media. They say “this data shows this”, or “if you eat bacon you will get cancer”. And then you read the study, and it actually says if you eat this much bacon you have this tiny increased chance of getting cancer, as an example. So again, how we communicate data, and what it’s actually saying, is so vital.
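The bacon-headline problem is largely the gap between relative and absolute risk, which a few lines make concrete. All numbers here are invented for illustration, not taken from any study:

```python
baseline_risk = 0.05      # hypothetical lifetime risk without the exposure
relative_increase = 0.18  # the "18% higher risk!" that makes the headline

exposed_risk = baseline_risk * (1 + relative_increase)
absolute_change = exposed_risk - baseline_risk

print(round(exposed_risk, 3))     # 0.059
print(round(absolute_change, 3))  # 0.009: under one percentage point
```

Both statements describe the same data, but “risk up 18%” and “risk up 0.9 percentage points” land very differently, which is the communication problem Euan is pointing at.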
Linda: It really is, but it doesn’t fit very well with the sound bite and the…
Euan: No, it doesn’t.
Linda: …clickbait headline. And there is a reason why sound bites and clickbait headlines are so popular and so effective: we like them. We like neat, encapsulated certainties. I want to know, is it safe to eat bacon or not? And the answer is, it depends. I want to know, is it safe to get vaccinated or not? And it’s like, well, it’s safer than not getting vaccinated, but there’s no such thing as 100% safe. Sitting here at my desk is not 100% safe. So it’s not easy to come to grips with the idea that the answer is maybe, and it depends.
Euan: Yeah.
Linda: It’s not the story we would like to tell ourselves.
Euan: No, people don’t like qualifiers.
Linda: No, and they think it means the science is untrustworthy. But actually what they’re getting is the best we can do with what we have at the moment. I wrote an article many years ago about fair trade – coffee and clothing and stuff, the whole idea of fair trade – and one of the people I interviewed said to me, you have to make the best decision you can on the data that you have today, knowing full well the data you get tomorrow might show that it was a bad decision. And that’s kind of science…
Euan: Absolutely.
Linda: …as well. You have to work with the evidence that you have today, even though the evidence that you get tomorrow might show that what you did was the wrong thing to do.
Euan: And we’ve seen that obviously with the pandemic – people have made decisions in the absence of a lot of data, because they just don’t know what’s going to happen. And that’s where we exercise what’s called the precautionary principle, and we see that in the environmental world as well. We don’t know a whole bunch of things, but we know that the risk is large: if this thing is “true”, in inverted commas, and we don’t act, then the damage done is large, so you exercise the precautionary principle. And obviously that’s the tragedy of the climate change “debate”, in inverted commas – people say, “oh, but climate scientists are not agreed about climate change”. Well, no, they’re agreed that it’s happening; they just argue about to what extent, and again, that’s healthy, that’s what the science is doing. But there’s basically no legitimate climate scientist who says climate change is not happening. So that’s the difference.
Linda: Yeah, it’s frustrating to see that happening, and to see the misuse of what we know. So, we’ve talked about data mistakes – what’s the worst deliberate data misuse that you’ve seen?
Euan: Yeah, well, fortunately – and hopefully by design – I’ve not been aware of data misuse by any of my colleagues, and obviously not in my own group, and that’s something that I really reinforce. It’s part of undergraduate training: the ethics, and the absolutely critical, non-negotiable professionalism that you just don’t modify your data in any possible way. But there have been some really high-profile recent cases, including in the environmental field – to do with acidification of corals and the impact on coral reef fish, and some behavioral ecology work. I’m not going to name the scientists, because there could be legal action, but they’ve been found to have allegedly engaged in scientific fraud, and these are really high-profile pieces of work, published in big-name journals.
Linda: Yeah.
Euan: So those are the cases I’m most aware of – data sets that have basically been made up, or the allegation is that they’ve been made up – and that’s devastating, because, as we discussed before, it taints other scientists and researchers. It feeds that denialist narrative: “oh, scientists can’t be trusted, they’re just making things up so they can all become rich and drive Mercedes-Benzes” and so forth. And knowing all my friends and their cars, they’re certainly not making any money out of climate change. So it is a real problem, because not only is the science clearly false, it damages the whole field. And also the people in those labs – I think that’s another really important point. If you’ve collaborated with these people in good faith, and then all of a sudden one day you find out that all this research was made up, you lose your publication record overnight. You lose all these things overnight, which can be incredibly damaging to your career prospects through no fault of your own.
Linda: Plus these results tend to linger long after they’ve been debunked. I mean, you only need to look at autism and vaccines to know that.
Euan: Oh gosh, yes – the Wakefield case. Awful.
Linda: It was debunked so quickly, and yet people are still citing that research, and he’s still talking about it.
Euan: I know. And I think the other worry is that some fields and data sets would be very easy to manipulate without being detected, while in other fields that would be harder to do. Think about my field as an example, where people are collecting data in the field – observations of animals and so forth. In many cases there’s no one out there, or very few people, so if someone was to fudge the data it would be very hard to find out. Whereas if you’re in a lab with multiple people seeing what’s going on, and someone reports something, someone else in the lab can say, “oh, hang on, I know that didn’t happen”. In the field you don’t have that check. So that’s really concerning too – that depending on where you are and what sort of work you’re doing, that sort of misuse of data, and scientific fraud, may or may not be easy to undertake, which is obviously really unsettling.
Linda: And the case you were talking about before – you said they were published in high-profile journals, which have review processes, and obviously the reviewers didn’t pick up any flaws. Is there anything you look for? As an academic you presumably review other people’s papers – are there things you look for in particular to spot both deliberate misuse but also just, you know, naive…?
Euan: Yeah, there are a couple of things that are really important, I think, in testing the validity of data. One is: can you repeat the experiment and find similar or the same results? There’s this big issue out there at the moment, the reproducibility crisis – a scientist will collect something, analyze it, write it up and put it out there, and then another scientist does it and gets a completely different result. Now, there might be legitimate reasons for that, but there might not be. So that’s a clear test for starters: can you reproduce their work or not? And if not, is there a legitimate reason? Maybe the temperature was different in the experimental setup, something went wrong and the animals behaved differently – or was it something more sinister?
And the other one is, when I see a data set, if it just looks too good to be true – particularly in ecology, where we’re used to pretty messy data sets. If I see a statistical relationship or a trend that’s incredibly strong, and there’s hardly any uncertainty or error in that data set, immediately my hackles go up. Most of the time, of course, it’s fine – it seems to check out. You read it carefully, you might talk to colleagues who have more expertise in a particular area and ask, look, does this relationship make sense to you? Would you expect to see that? So you do your due diligence, and that’s the process you go through. But of course it’s really hard to be a hundred percent sure whether something is legitimate or not. Certainly when you see really consistent, clear relationships, and particularly if you see them over and over and over again, that’s where some alarm bells would be going off for me.
Linda: Yeah, that’s a thing that keeps coming up on this podcast – if it comes out really neatly, if you get a really good result, that’s where you go, hang on a minute, something’s wrong.
Euan: And particularly in my area, field-based ecology, where you don’t have the power to control and manipulate everything. Imagine a laboratory setup where you might be able to literally control every single facet of an experiment, so you can reduce it to the one factor you’re interested in – great, but that’s not the real world. We want to understand systems in the real world, and therefore you have to embrace noise; there’s a lot of noise in data sets, and a lot of uncertainty. So when you don’t see that in field-based ecological data sets, it’s concerning.
Linda: Yeah, and that’s again something that seems to be very absent from the way we teach the use and analysis of data – teaching it with real data sets, where they are messy and complicated, and don’t make as much sense as we’d like them to. What’s the first question you ask when you look at graphs in the media?
Euan: The first question I ask is probably how the data was collected – what sort of methods were involved. If someone’s emphasizing a really strong relationship, claiming really strong evidence for something, I’d be asking, was there a control? Was there replication? Those really basic, boring but fundamental things in science. Because again, there’s the same old adage that correlation is not causation, and it really matters: depending on how forthright you’re being with your language, what evidence do you need to really say that?
So I would always be looking at the methods, at what someone’s actually done – how rigorous are those methods? And have they come at this from more than one direction? Maybe they did a field observational study and found something; did they then try to test that in the lab as well? If they can get the same sort of patterns emerging from multiple lines of evidence, maybe using different techniques, then of course it becomes more convincing. That’s the rigor of science, and I think the wonder of science: we have these tools at hand, and different approaches, and depending on how important something is, we can bring them to bear and really get at a question with more confidence. And it obviously comes back to how important it is. If you’re trying to develop a cure for cancer, you need to be really sure this is correct. If it’s something far less important, you still obviously need to conduct your work in a professional and rigorous way, but you might not need as many checks and balances, because the ramifications of the work are far smaller – if there’s a bit of error, maybe it’s okay. But when you’re talking about someone’s life, and the application of a treatment, you want to make sure it’s right. So those are the sorts of things we weigh up when we’re looking at science and how it’s been conducted, and therefore how legitimate it may or may not be.
Linda: And I love that the open data and open science movements are shifting us to a place where we can analyze someone else’s data sets and see if we come to the same conclusions. I saw a quote years ago, something along the lines of, you can’t say you have a result until you’ve got it at least twice. And it’s this idea of science as fallibilism – you’re not actually trying to prove something, you’re trying to disprove it.
Euan: Oh, exactly. Any time someone says “this is proven”, a small part of me inside dies, because that’s not how science works.
Linda: Yeah, every time someone says something is proven, a fairy dies. And I see it in papers all the time, or in people reporting studies, going, “the object of the study was to show that…” and I’m like, no!
Euan: yeah, automatically you’re not being objective if you’re doing that
Linda: that’s exactly right, if you’re looking for a particular result you’re more likely to find it
Euan: and that’s where, again, like these issues we discussed before, you can get into these really slippery slopes and danger, where you’re looking for a result, you’re searching for a result, and that’s not what you should be doing as a scientist. you should have hypotheses and questions in mind, but then obviously when the data comes back, that’s the data, and then you infer what you think it’s telling you from that, but you certainly don’t go looking to show something.
Linda: yeah, and coming back to what you were saying about outliers before, sometimes we think of faking data sets as people making up whole sets of numbers, but it might be as simple as throwing out the outlier that absolutely messes up your nice neat aha moment
Euan: you’ve got a regression or something, and you’ve got a data point that’s sitting up in the right corner or the left corner all by itself, it’s going to change the slope of that line, that’s just a fact
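[Editor’s note: Euan’s point about a single point dragging a regression slope is easy to see with a few made-up numbers. The data below is purely illustrative, not from any study discussed in the episode.]

```python
def slope(xs, ys):
    """Least-squares slope of y on x (ordinary linear regression)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Ten points lying exactly on the line y = 2x.
xs = list(range(10))
ys = [2 * x for x in xs]
print(slope(xs, ys))                           # 2.0

# Add one outlier sitting alone in the top-right corner:
# it more than doubles the fitted slope.
print(round(slope(xs + [9], ys + [100]), 2))   # 5.32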
Linda: yeah, and I often call myself a professional outlier, because I know it doesn’t matter which scale you measure me on, I’m probably not normal. and so you get people who have reactions to drugs, or respond differently in some way, or don’t do what they’re expected to do when they use pieces of software. I’m really good at breaking software.
Euan: yeah
Linda: and when you go, well, those outliers don’t matter, you’re throwing out the people who aren’t normal, as well as you know
Euan: and not only that, you’re throwing out opportunities to learn. I don’t work on people, I work on non-human animals, but whenever I see outliers I get excited, and I’m like, oh okay, so this individual is really different, what could we potentially learn from them? what does that open up? so I think when you’re throwing that out, you’re actually throwing out huge opportunities to learn, and potentially, in the case of people, even have breakthroughs, because, oh, this person’s different! and if that difference is an advantage, it’s something that might benefit other people, or people in general. so it’s just a silly sort of notion to dismiss outliers
Linda: yeah, and sometimes the experiments that go wrong are the ones that teach us more, like penicillin, an experiment that went wrong and turned out to be really vital
Linda: what is it that excites you about data?
Euan: I think data is just a sub-section of science, in that it’s creative, in the sense that as scientists we are creating information that we didn’t have before. so you might be understanding the day-to-day habits and lives of a dingo that we just didn’t know before, so I often think about science as quite similar to an artistic pursuit. it’s actually inherently creative, and then you get to use that data to tell stories about what you think is happening. so that’s what I find really inspiring about science, and also really profound and powerful, because again, the area that I work in, I’m passionate about conserving species and using science to help make better decisions. so the joy of just collecting data and finding out about an individual, about a species, and then being able to use that potentially, is really exciting. and also, I guess, opening a window for the public, because I’m an ecological nerd, so I find all this stuff really interesting, but the public actually often do too. they don’t get to have that experience and connection that a lot of ecologists do, because they’re not out and about doing it, they’re doing other things, and that’s fine. so whenever we find out something super interesting about the daily habits of a species, or how it’s interacting with others and so forth, it’s just really fun. and getting to share that “magic”, in inverted commas, and that information that we didn’t previously have, that’s really cool.
Linda: yeah, it is, it’s an opportunity to really make change. I didn’t warn you about this question because I’ve only been thinking about it the last day or so, and we can scrub it if it’s too much, but how do you maintain your optimism? because we’re seeing at the moment, especially, this new government that was going to be so much better than the old government, and it’s still approving coal and gas. how do you keep going with that?
Euan: it’s deeply frustrating, there’s no question about that, and it’s upsetting. and I don’t think I’m alone, as anyone who really cares about the environment, in cycling through emotions of frustration that we’re just not listening to the scientists and experts, anger because of what that means, and then sadness about the inevitable outcomes of that. so I oscillate between all of those. but in terms of how do you get up each day and keep going, well, the short answer to that is, what’s the alternative? and the alternative is you just give up, and things potentially are far worse. and I think that’s another really interesting thing that data helps us with, thinking about the counterfactual. so even though things are really bad for some species, and some species are now extinct, and some species are going to become extinct, there’s a whole range of species that have been saved from extinction because of what really passionate scientists and members of the community have done. so there is hope, and data does provide that hope, because it tells you, well okay, this is what we’re up against, this is what we think is happening, this is how we might best manage that. and then of course we have to try and hopefully bring community and governments and other decision makers on board to act on that. unfortunately, as you rightly point out, and as you’ve seen me rant about on social media regularly, it’s just not happening at the pace and extent that it needs to, for climate change, and for addressing the other equal crisis, which is biodiversity decline and extinction. but I just can’t afford to give up. and I genuinely love wildlife and the environment and nature, and now my family does too.
and I want them to hopefully be able to enjoy many of the things that I’ve already enjoyed. and even things like the Great Barrier Reef, seeing how that’s deteriorated in basically a quarter of a century, which is just a little bit more than half my life, that’s profound.
so I remember in undergraduate, I was in Townsville being taught by people like Professor Terry Hughes, the coral expert, and Professor Stephen Williams, who’s an expert in the mammals and birds and reptiles of the wet tropics of north Queensland. they were both warning me in the mid 90s and early 2000s about what the possible impacts of climate change would be, either for coral bleaching, or animals having to move up mountains as it got hotter because they’d run out of places to live. and as worrying as I thought that was at the time, I also partly thought maybe they were just being a little bit pessimistic and it won’t be that bad. but what they said has come to pass, and in fact in some cases far quicker and more severely than we would have hoped. so it is really upsetting, but it also shows me that we can’t give up. and honestly, I don’t know what else I would do if I did. I’m definitely stubborn and incredibly passionate, so yeah
but on a serious note, the vast majority of people I know working in the environmental, conservation and ecological arena definitely suffer from what I would say are mental health issues, ranging from quite severe to mild, and I’ve certainly had my struggles with that. because if you love something deeply and you understand it deeply, and you tell people “these horrible things are happening and this is all you need to do to fix it” over and over and over again, and it doesn’t change, and then what you predict happens, that’s really upsetting. and I can only imagine it must be very similar for people working in healthcare and so forth, who warn of a disease, including during the pandemic, and how we might tackle that. they also have this intimate knowledge, and they know what to do, they’re just not being supported to do it, and then we know the outcomes. so it’s a similar sort of situation. so yeah, it’s a question that I’m often asked, and it’s a really difficult one, but the short answer is we just can’t afford to give up, and when we do get supported and use data and evidence, we can achieve wonderful things.
Linda: That’s a magnificent note to end on thank you so much this has been a fabulous chat I’ve been looking forward to it for ages and it has been really awesome to have you on thank you so much.
Euan: Thanks Linda.
Linda: Thanks for listening to Make Me Data Literate. You can support the work of the Australian Data Science Education Institute at givenow.com.au/adsei. Tune in next time for more conversations with amazing data experts.
