Content warning: this week, some articles may contain distressing content or images, particularly in relation to the events in Christchurch.

Friday Faves is our weekly blog series highlighting a few select pieces from the REG team’s reading lists. You can catch up on past Friday Faves in the archive.

The role of algorithms and why we’re asking the wrong questions about social networks after New Zealand attacks

Anne says: Sometimes it takes tragic events to amplify issues that we’ve been vaguely aware of. This week was one of those times.

The horrific tragedy in Christchurch, NZ, streamed live on Facebook, has amplified discussions of sharing via social media and user behaviour (why would we want to watch innocent people being executed? And why would we repost and share it? That’s another blog post, for another time). But underlying this behaviour is something I find more ominous: how algorithms – AI – are being used to amplify our behaviours.

The first article, from Scientific American, covered territory I was already aware of – other members of the team and I have written about similar themes previously, particularly the work of Tristan Harris. YouTube has a wonderful side: people creating videos and sharing their experiences, something especially popular with the DIY, fix-it-yourself set. But there’s a dark side. Have you noticed how, immediately after you finish the video you searched for, another is about to start? That’s a recommender system, powered by artificial intelligence (AI). But unlike other recommender systems (“if you liked this content, you’re likely to find this interesting/valuable”), the YouTube algorithm pushes content it deems engaging – and here comes the dark side: engaging content can be very dark…
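As a toy illustration – this is emphatically not YouTube’s actual system, and the video titles, topics and scores below are all invented – the difference between a “more like this” recommender and one that optimises purely for predicted engagement might look like this:

```python
# Toy sketch: two ways to pick the next video. Neither is YouTube's real
# algorithm; all data here is made up for illustration.

def rank_by_similarity(videos, watched_topic):
    """Classic 'more like this' recommendation: prefer the viewer's topic."""
    return sorted(videos, key=lambda v: v["topic"] == watched_topic, reverse=True)

def rank_by_engagement(videos, watched_topic):
    """Engagement-optimised recommendation: ignore topic, chase watch time."""
    return sorted(videos, key=lambda v: v["predicted_watch_minutes"], reverse=True)

videos = [
    {"title": "How to fix a leaky tap", "topic": "DIY", "predicted_watch_minutes": 4.0},
    {"title": "Tap washer types explained", "topic": "DIY", "predicted_watch_minutes": 3.0},
    {"title": "Outrage compilation", "topic": "extreme", "predicted_watch_minutes": 11.0},
]

# After watching a DIY video, the two rankings diverge:
print(rank_by_similarity(videos, "DIY")[0]["title"])   # stays on-topic
print(rank_by_engagement(videos, "DIY")[0]["title"])   # drifts to whatever holds attention
```

The point of the sketch: once the objective is “whatever keeps people watching”, the viewer’s actual interest stops being the thing being optimised.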

According to the article, Google (which owns YouTube, in case you didn’t know) has a model that incentivises the content that is watched the most. Now think about what happened in New Zealand… The second article, an opinion piece in the New York Times, highlights this structural problem.

“…the graphic, high-definition video of the attack was uploaded by users 1.5 million times in the first 24 hours. Of those 1.5 million copies of the video, Facebook’s automatic detection systems automatically blocked 1.2 million. That left roughly 300,000 copies ricocheting around the platform to be viewed, liked, shared and commented on by Facebook’s more than two billion users.”
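To put the quoted figures in context, the arithmetic is simple:

```python
# The numbers quoted in the NYT piece: 1.5 million uploads in the first
# 24 hours, 1.2 million caught by automatic detection.
uploads = 1_500_000
auto_blocked = 1_200_000

leaked = uploads - auto_blocked          # copies that slipped through
block_rate = auto_blocked / uploads      # share caught automatically

print(f"copies that slipped through: {leaked:,}")   # 300,000
print(f"automatic block rate: {block_rate:.0%}")    # 80%
```

An 80% catch rate sounds high until you see that the remaining 20% is still 300,000 copies in circulation.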

Facebook has been quite open about its attempts to stick a finger in the dyke, but with figures like these it’s almost impossible to avoid leakage. Meanwhile, over at YouTube:

“YouTube took “unprecedented steps” to stanch the flow of copies of the video that were mirrored, re-uploaded and, in some cases, repackaged and edited to elude moderation filters.”

So we’re not just dealing with the raw material; we’re also dealing with re-packaged, re-purposed material. And the more hits, the harder the algorithm pushes the “engaging” content. The key point in the article, however, is where we need to focus our attention:

“The horror of the New Zealand massacre should be a wake-up call for Big Tech and an occasion to interrogate the architecture of social networks that incentivize and reward the creation of extremist communities and content.”

Exactly! This is the issue – the kind of content, and the users, gaming the algorithm to distribute extremist views. The author argues that the big social media platforms need to take responsibility. But we – the people who are being pushed this content AND watching it AND sharing it – also need to play a role in holding these platforms to account and demanding changes to their incentive-driven algorithms.

And take a lesson from Jacinda Ardern, New Zealand’s Prime Minister: don’t watch it, don’t encourage them, and please don’t share it.


Why companies want to mine the secrets in your voice

Helen says: We have witnessed rapid development in voice technology, and its applications are now prevalent in many homes thanks to Alexa, Google Home and smartphones. We can switch on lights, change a tune or phone a friend, all with a voice command. Now, with the aid of voice mining technology, some companies are looking beyond simple voice capture to more detailed voice analysis, and their research is demonstrating that the voice can reveal a surprising amount about us – and not through our words: our voice itself is a rich data set. By using algorithms and AI to analyse complex voice features and speech patterns, and continually learning from them, it is conceivable that during a real-time conversation our voice could be monitored and we could be evaluated for anything from having a particular disease, to suffering from stress or depression, to whether we are likely to default on a loan or how likely we are to leave our job.

The potential for some really good outcomes in mental health management is noted, and I am sure many other positive applications are possible. But what measures will be taken to ensure that, simply by using my voice, I am not giving away personal information without my knowledge? How will I know what information has been collected about me, whether it is true, and what it will be used for? These questions are not new when it comes to other AI technology and the use of algorithms, but I would like to learn more about what is being done to address them.


How Will Google Overcome Stadia’s Biggest Obstacle?

Joel says: In case you hadn’t heard, earlier this week at the Game Developers Conference in San Francisco Google revealed its new game streaming platform, Stadia. Think of it as a Netflix for games. Stadia will allow anyone with a device that can run Google Chrome to stream and play games at 60 frames per second in 4K resolution – if you have an internet connection that can handle the required bandwidth.

Gone may be the days of buying a game on disc and installing it on your home console. Stadia users will be able to play the latest games with no install times, streamed straight from Google’s servers to your browser or Chromecast Ultra. Well, that’s how it would work in an ideal world. This Gizmodo article tackles the biggest issue people will have with Stadia: the internet requirements needed to get the best experience out of the service.

The problem with a gaming platform that relies entirely on the internet is that it relies on the internet.

The article notes that even the journalists at the Stadia demo event had trouble getting the most out of the system, because the convention centre couldn’t meet the internet requirements in a stable capacity. And although the piece is written from a US perspective, the same issues will very much apply here in Australia – perhaps even more so. Google’s published requirements state that Stadia needs a connection of at least 25 Mbps to stream in 4K at 60fps, meaning that unless you have an NBN or decent cable connection, you’re going into the experience with a handicap.
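As a back-of-the-envelope sketch – the 25 Mbps figure is Google’s guidance as reported, while the 20% headroom margin is my own assumption to cover other household traffic and throughput dips – checking whether a connection clears the bar might look like this:

```python
# Rough bandwidth check for game streaming. The 25 Mbps requirement is
# the reported figure for Stadia's 4K/60fps tier; the headroom factor
# is an assumed safety margin, not anything Google has published.
STADIA_TIERS_MBPS = {
    "4K / 60fps": 25,
}

def can_stream(connection_mbps, tier="4K / 60fps", headroom=1.2):
    """Return True if the connection clears the tier's requirement
    plus a 20% margin for other traffic and throughput dips."""
    return connection_mbps >= STADIA_TIERS_MBPS[tier] * headroom

print(can_stream(25))    # a bare 25 Mbps line: fails once headroom counts
print(can_stream(50))    # a decent NBN plan: passes comfortably
```

The sketch makes the article’s point concrete: a connection that only just meets the stated minimum leaves no room for anything else happening on the network.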

Since the presentation, Google has confirmed that Stadia will begin rolling out later this year in the “US, Canada, UK and most of Europe”, and that Australia will not be getting the service initially.

While I love the idea of the service, the reality is that much of the world may not be ready for Stadia quite yet. Neither US nor Australian internet infrastructure is up to scratch to ensure that everyone who buys in gets a quality experience, with reasonable resolutions and minimal latency for controller input. I do believe games are headed for an ‘all-digital’ future, but I think it’s the far future. Unfortunately for Google, we may be playing catch-up with Stadia here in Australia.

If you have a spare hour and want to check out the full Google Stadia presentation, you can watch the video above. The tech is certainly impressive.


5 steps to conquer impostor syndrome

Jakkii says: Ah, impostor syndrome, you sneaky little devil. I know, practically speaking, that I am not alone in experiencing bouts of any one of these five types of impostor syndrome, but there are times when it definitely feels like you’re the only one who has ever felt that way – because you are, after all, the only impostor. What we let our minds tell us is, objectively, quite ridiculous – and destructive – at times. And yet here most of us are, navigating a variety of negative thoughts and behaviours we’ve somehow developed, learned and trained ourselves into, whether via nature, ‘nurture’, or both.

If you, like me, are inclined to the occasional bout of impostor syndrome – and statistically it is quite likely, given one study found two-thirds of women in the UK have suffered from it at work (and that women are 18% more likely than men to experience it), while another finding instead suggests that men are more likely to experience it than women – then have I got an article for you! Whoever you are, and however you suffer from thinking you’re not good enough, this piece in Quartz looks at 5 steps to conquer impostor syndrome. Woohoo!

  1. Get your validation from within
  2. Give yourself permission to say no
  3. Remember, others are not thinking about you as much as you think they are
  4. Have empathy toward others who also may not feel good enough
  5. Let yourself be vulnerable

The whole article is worth a read, as it expands on each of the steps. What do you think, my fellow impostors – do these sound like they could be effective for you?


This week in social media

Politics, democracy and regulation

Privacy and data

Society and culture

Cybersecurity and safety

Christchurch and hate speech

Moderation and misinformation

Marketing and advertising


Sydney Business Insights – The Future This Week Podcast

This week: big tech breakup, Amazon power and a Facebook rethink. Sandra Peter (Sydney Business Insights) and Kai Riemer (Digital Disruption Research Group) meet once a week to put their own spin on news that is impacting the future of business in The Future, This Week.

00:45 – Breaking up big tech

16:41 – Zuckerberg rethinks Facebook’s privacy

23:08 – Amazon’s platform power

29:11 – Robot of the week: The Guardian

The stories this week: 

Elizabeth Warren wants to break up Big Tech

A privacy-focused vision for social networking from Mark Zuckerberg

Amazon ousted thousands of merchants with no notice

Other stories we bring up: 

Our previous discussion of big tech and antitrust legislation

MIT on how Zuckerberg’s new privacy essay shows why Facebook needs to be broken up

What Mark Zuckerberg did and didn’t say

Zuckerberg’s announcement is just a stunt, says former mentor

Facebook isn’t Facebook’s future

Notes on Elizabeth Warren’s proposal

Our previous episode on face recognition for fish

Robot of the week:

The Guardian – the fish-zapping robot hunting lionfish 

