Critical Thinking – Recognising Dodgy Arguments


[Image: stylised silhouette heads; the upper head looks displeased listening to a devil, the lower head looks pleased listening to an angel]

Critical Thinking is a set of six short, animated videos by Australian foresight agency Bridge8, which created the series for technyou, an emerging technologies public information resource funded by the Australian Government.

Originally designed as a teaching resource for secondary school students, this visually appealing series will be useful to change agents both personally and in any teaching, coaching or training they may do.

The animations explain key concepts in clear and easily understandable ways (‘logic is a way to combine ideas to come to a conclusion. It’s like maths only it can deal with more than numbers’), with affectionate touches of humour.

Transcripts are also available for each video, along with a colourful Recognising Dodgy Arguments companion guide, which can be downloaded in a postcard-sized or an extended version (both PDF).

Part 1: A Valuable Argument

Part 1 is about how the human brain takes shortcuts to help us deal with the complexity of our world. But there are times when we need to be careful not to let our ‘shortcuts’ and our biases substitute for deeper thinking. Logic is a tool that supports our thinking processes and helps us identify ideas that may be helpful.

Logic is a useful way to combine established ideas to support the acceptance of a new idea. Looking for logic in an argument can help you decide whether you should agree with somebody, or wait for more information.

Transcript (pdf)


Part 2: Broken Logic

Part 2 is about the structure of a logical argument, and how to distinguish between a logical argument and a logical fallacy. It uses the structure of a maths equation to explain premises (something we already know or agree upon) and conclusions that can be drawn from combinations of premises.

It’s easy to mistake a logical fallacy for the real deal if you’re not careful. People do it all the time. Sometimes by accident and sometimes to fool you. Knowing the structure of a logical argument is important.
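For readers who like to see the ‘maths equation’ idea made concrete, here is a small Python sketch of my own (not part of the series) that brute-forces a truth table: an argument form is logical only if every combination of true/false values that makes all the premises true also makes the conclusion true. Modus ponens passes the test; its look-alike, affirming the consequent, does not.

```python
from itertools import product

def is_valid(premises, conclusion):
    """Valid means: every truth assignment that makes all the premises
    true also makes the conclusion true."""
    for p, q in product([True, False], repeat=2):
        if all(prem(p, q) for prem in premises) and not conclusion(p, q):
            return False  # found a counterexample, so the form is broken
    return True

# Modus ponens: "If P then Q; P; therefore Q" -- a logical argument.
print(is_valid([lambda p, q: (not p) or q,   # premise 1: if P then Q
                lambda p, q: p],             # premise 2: P
               lambda p, q: q))              # conclusion: Q  -> True

# Affirming the consequent: "If P then Q; Q; therefore P" -- a fallacy.
print(is_valid([lambda p, q: (not p) or q,   # premise 1: if P then Q
                lambda p, q: q],             # premise 2: Q
               lambda p, q: p))              # conclusion: P  -> False
```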

Transcript (pdf)


Part 3: The Man Who Was Made of Straw

Part 3 is about recognising a ‘straw man’ – a misleading characterisation of an argument – created either by those opposing you or, sometimes, unwittingly by yourself. Refuting a straw man means you have not, in fact, defended your own argument, but have been drawn into defeating something else altogether.

Logic is built up of ideas called premises. Even if they seem logical, it’s important to pay attention to those premises to make sure that they’re not made of straw.

Transcript (pdf)


Part 4: Getting Personal

Part 4 is about making the distinction between the ‘messenger’ and the ‘message’, and not confusing how you feel about someone with whether you trust what they have to say.

It’s hard to listen to people we don’t like, and difficult to disagree with those that we trust and admire. But there’s a difference between who a person is and what they’re saying.

Transcript (pdf)


Part 5: The Gambler’s Fallacy

This part is about how we tend to assume that the probability of something happening is determined by past results, even when the events are independent.

Just because one thing follows another, even if it happens a few times, does not necessarily mean that they’re linked. There could be other factors, or it could simply be coincidence.
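To put a number on the gambler’s fallacy, here is a quick Python sketch of my own (not from the video): simulate a fair coin and compare the overall chance of heads with the chance of heads immediately after a run of three heads. Both come out at roughly one in two, because the coin has no memory.

```python
import random

# Simulate a large number of fair coin flips (True = heads).
random.seed(1)
flips = [random.random() < 0.5 for _ in range(1_000_000)]

# Outcomes that immediately follow a run of three heads.
after_streak = [flips[i] for i in range(3, len(flips))
                if flips[i - 3] and flips[i - 2] and flips[i - 1]]

print(f"P(heads) overall:       {sum(flips) / len(flips):.3f}")
print(f"P(heads after 3 heads): {sum(after_streak) / len(after_streak):.3f}")
```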

Transcript (pdf)


Part 6: A Precautionary Tale

This part is about the precautionary principle: although it is sensible to consider possible adverse consequences before acting, it is impossible to remove all the risks associated with every action.

…waiting for irrefutable data, which is logically impossible, is a bad way to make decisions… Asking about risks is sensible. But demanding one hundred percent safety stops technology from evolving.

Transcript (pdf)


Note: while I appreciate the point of the argument in the last clip – that irrefutable data and a 100% guarantee of safety are logically impossible – we owe it to ourselves and all life to take a ‘mission critical’ approach to our planet, as we do with aircraft or any other technology. As ex-RAAF and Boeing engineer Andrew ‘Wilf’ Wilford put it:

If we consider our entire planet as a safety and mission critical system, how sophisticated should risk management approaches be for such important issues as accelerating climate instability, energy security, ecosystem vulnerability, and resource depletion, among other issues? Wouldn’t it make sense to apply similar precautions?

At the core of effective risk management is the realization that just because something hasn’t happened before, it doesn’t mean that it won’t happen in the future. So, if the consequences of failure (i.e. in runaway climate change) are catastrophic, then it’s appropriate to rapidly and effectively intervene to reduce the likelihood of such an outcome.

Just as not being 100% certain of safety is not a reason to ground all aircraft, the lack of irrefutable data is not a sufficient reason to abandon the precautionary principle – a central tenet of sustainability – which is about making decisions that do not pose a threat to people and nature, even if that means we forego some opportunities.

Have you ever been in the middle of a debate and realised that you’ve been sidetracked by faulty logic or a straw man? 

How can we decide whether the precautionary principle should be invoked, given that it is logically impossible to have 100% certainty?

