Friday Faves is our weekly blog series highlighting a few select pieces from the REG team’s reading lists. You can catch up on past Friday Faves on the archive.
What is a robot anyway?
Anne says: Currently, in the mainstream media, there are seemingly endless articles about robots. In particular, how robots will take our jobs, how they’re going to kill us (eventually) and on and on and on…
Personally, I’ve always been rather intrigued by robots. It possibly all started with the robot in Lost in Space (“Danger, Will Robinson! Danger!”) – the kind of companion that helps me stay out of trouble. But then there was HAL in 2001: A Space Odyssey, when HAL says: “I’m sorry Dave, I’m afraid I can’t do that”… that was spooky… what if my companion robot turned against me? More recently, the aged care robot in Robot & Frank reaffirmed my desire for a robot companion.
Many of our contemporary perspectives on robots have been shaped by the entertainment industry – movies, TV shows and so on. But how useful are these perspectives when we’re considering the now widely accepted prospect of working alongside robots?
Over the next couple of weeks I’m going to share some of my robot readings and attempt to reframe my perspective to become more realistic – less movie influenced!
For starters, let’s just check: what is a robot, anyway? A recent article published by Wired magazine unpacks the attributes that define a robot. But apparently it’s not that straightforward. A number of roboticists provide their views, which include aspects such as:
- an artificially intelligent agent;
- a physical machine programmed to execute tasks autonomously;
- a system that exhibits ‘complex’ behaviour and includes sensing and actuation.
If the experts can’t agree on a single definition, we start to see where some of the confusion arises. The Wired article then turns to autonomous behaviours and differing degrees of autonomous intelligence. It appears we don’t consider a machine a robot if any degree of human intervention is required to operate it.
To close out the definition dilemma, we’re left with:
A robot is a machine that senses and acts on its world.
How useful is this when we’re considering our future workplaces shared with robots? I’m not convinced. However, it’s a start and highlights the need for us to be flexible with our mindsets as technology and advances in robotics progress.
A series of videos embedded at the beginning of the article provides some insight into the current state of robot development – worth watching before you read any more articles telling us robots will be taking over the world!
Read: https://www.wired.com/story/what-is-a-robot/
The AI Doctor will see you now
Nat says: The Human Diagnosis Project, also known as Human Dx, aims to identify community health symptoms long before people become aware of them and visit their doctor. Inspired by open projects like Wikipedia, Human Dx combines the knowledge of thousands of doctors with machine learning. The goal is to gather collective intelligence from experts and apply data analysis to reveal medical insights. As the article explains:
As with Wikipedia, people can enter relevant information for Human Dx, but in this case it’s the global medical community who are invited to submit clinical case contributions and knowledge. With medical practitioners, residents and students providing input on clinical cases for the project, Human Dx can use machine learning to automatically contextualise their decisions and individual clinical insights.
Instead of Googling your symptoms, or convincing yourself you’re dying after a visit to WebMD (which reminds me of the Tweet below), Human Dx uses collective insights based on real cases. The project is said to empower people to recognise their symptoms, but also to know when and where to seek help for treatment.
EMINEM: his palms are sweaty, knees weak, arms are heavy
WEB MD: cancer
— Bea_ker (@bea_ker) September 1, 2017
Technology’s role in the healthcare industry is nothing new – we already have cancer-spotting AI algorithms, and tools like CRISPR for gene editing and bio-hacking. As technology evolves it becomes ever more integrated into our lives, and into our skin. One day we’ll have technology telling us when to go to the doctor based on its recordings of our bodies, and the doctor will likely be a machine too. Terrifying or exciting times? One must assume such technology is trying to prolong our lives, but is this out of a desire to live a long and happy life, or does it stem from a fear of death? It wasn’t that long ago that a Dutch study claimed the maximum human lifespan is around 115 years. To live any longer, we’d have to become part machine – which is arguably where we’re headed!
Read: https://www.redbull.com/int-en/human-diagnosis-project-ai-doctor
Australia to launch beach-protecting, AI-powered shark drones
Joel says: We’ve had drones on the brain a bit lately. A couple of weeks ago I wrote about how there would be 1 billion drones in the world by 2030. It seems we’ve found a new use for drones in everyday life right here in Australia.
Australia is famous for a few things: sunshine, excellent beaches and a huge array of deadly animals that will bite/poison/sting you at a moment’s notice. With that in mind, a group of researchers have created a new shark-detecting drone capable of finding the apex predators underwater, quicker than the human eye and with a higher level of accuracy.
The technology, known as SharkSpotter, uses an algorithm to detect sharks in the live video feed captured in real time by a drone (known as the Little Ripper Lifesaver) flying above the water. The world-first algorithm, developed using artificial intelligence and deep neural networks, is able to distinguish sharks from dolphins, rays and other marine animals – and even surfers. Thanks to an onboard megaphone, the drone can also warn swimmers about what’s lurking in the water before they’ve even seen the threat.
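For anyone curious what the nuts and bolts of this kind of system might look like, here is a minimal, hypothetical sketch in Python of a frame-by-frame classifier running over drone footage. The label set, model weights file and video source are illustrative assumptions of mine – the actual SharkSpotter implementation isn’t published in the article.

```python
# Hypothetical sketch only: not the real SharkSpotter / Little Ripper code.
# It illustrates the general shape of classifying each frame of drone footage
# and raising an alert when a "shark" label is predicted.
import cv2                     # pip install opencv-python
import torch                   # pip install torch torchvision
from torchvision import models, transforms

CLASSES = ["shark", "dolphin", "ray", "surfer", "other"]   # assumed label set

# Assume a classifier fine-tuned on aerial beach footage was trained earlier
# and saved to disk; here we only load it for inference.
model = models.resnet18(num_classes=len(CLASSES))
model.load_state_dict(torch.load("aerial_marine_classifier.pt", map_location="cpu"))
model.eval()

preprocess = transforms.Compose([
    transforms.ToPILImage(),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

cap = cv2.VideoCapture("drone_feed.mp4")   # could also be a live stream URL
while cap.isOpened():
    ok, frame_bgr = cap.read()
    if not ok:
        break
    # OpenCV delivers BGR frames; convert to RGB before preprocessing.
    frame_rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    with torch.no_grad():
        logits = model(preprocess(frame_rgb).unsqueeze(0))
    label = CLASSES[int(logits.argmax(dim=1))]
    if label == "shark":
        print("Shark detected - trigger the onboard megaphone warning")
cap.release()
```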
The Empathetic Dog
Whoopi says: As a member of the Ripple Effect team, I have the opportunity to observe their work from a different perspective. I hear them talk a lot about developing empathy with people in the workplace as a critical element of their design approach. They say without this type of deep understanding of how someone is experiencing a particular context, they may make flawed assumptions about their needs and ways of working.
I find this intriguing! As a companion dog, I scored off the Richter scale on the empathy test in Dr Brian Hare’s Dognition Assessment. So it made me ponder why the people in my team struggle to convince clients how important it is to create personas to guide design projects. For me, empathy and connection with people help me determine what I can do to provide assistance.
The article I’m contributing this week illustrates exactly how we, companion dogs, use empathy.
Now, imagine if you had an empathy dog in your workplace. Imagine if we could identify what was making you anxious and how to help you manage those situations. Wouldn’t your workplace become more engaging, more fulfilling?
Thanks for listening – I’m hoping to contribute more frequently to Friday Faves. Let me know if there are any topics you’d like me to address.
Read: The Empathetic Dog https://www.nytimes.com/2017/06/04/well/family/the-empathetic-dog.html
How Reading Rewires Your Brain for More Intelligence and Empathy
Jakkii says: Are you a reader?
I love to read. I have always loved reading, and as a kid I always had my head in a book – so often that there were occasions I’d bump into street posts because I was reading while walking, or be so absorbed in the world unfolding in the book I’d miss my school bus – even with everyone yelling out to me that the bus was here and I should get on board (mum was less than impressed when she had to come and pick me up from school).
With that in mind, I found it quite affirming to read this piece on something I’ve always felt to be true – that books, through their ability to transport you to other places, show you other perspectives and help you feel what the characters feel, are a great vehicle for empathy. Though this article emphasises the role of fiction, in my opinion it’s not just fiction that can help us: many talented non-fiction writers do an amazing job of telling the story of their chosen subject, taking readers on a journey and expanding our minds with new or different ideas, challenging us, helping us learn and grow.
But it turns out it’s not just a sense we might have about what reading can do for us: various studies show the impact reading has on our minds (explored, for instance, in this earlier piece from The Guardian, Can Reading Make Us Smarter?). Reading helps us with emotional intelligence, without which we struggle with empathy. The author tells us:
Novel reading is a great way to practice being human… As you dive deeper into Rabbit Angstrom’s follies or Jason Taylor coming of age, you not only feel their pain and joy. You actually experience it.
This sense of experiencing the pain and joy of others is a critical aspect of empathy. When conducting empathy exercises, we often see people put themselves in the situation and imagine how they themselves would react. Instead, what we are aiming to do is feel the experience from the perspective of the other – how would the other person feel? What would they be thinking? What might they say or do in that situation? Through well-crafted characters, settings and stories, fiction helps us exercise this skill and practise empathy – seeing the world through the perspective of another.
The Silent Film Returns — on Social Media
“It’s striking that with all of the technological advances that have allowed us to shoot and share video instantly, we’ve returned to some of film’s most original instincts.”
Emilio says: In the current ‘pivot to video’ phase of social media where video content is king, an uncanny renaissance is taking place. Short, muted, and often silly videos – mimicking the silent slapstick clips that preceded full-length black and white cinema – are now amongst the most watched and most shared content on dominant social platforms like Facebook.
A large chunk of the video consumed on social media is watched silently, claims video analytics provider Tubular Labs. It says nearly half of all brand video views on Facebook are without sound or have only background music.
Driving that point further, The New York Times this week published a story which makes for a fascinating read. The theory as to why short, silent video clips are consumed and shared the most on social media seems to come down to how we browse our favourite social platforms on mobile – and our diminishing attention spans. Put simply, our penchant for scrolling down our Facebook, Instagram and Twitter feeds means muted videos cause less disturbance; otherwise, the sheer noise from a succession of videos playing would annoy the hell out of us. Those types of animated videos are also well suited to mobile viewing, even without audio narration and context. The funny yet witless memes on BuzzFeed and the buffoonery fare on LADbible offer a grand showcase.
Want to see an example of the similarities between the kind of videos that go viral today and the hit animated clips of yesteryears? Check out this cat video by French cinematographer and pioneer filmmaker Louis Lumière circa 1897. The undisputed purr-y stars of the internet sure were making waves even in the olden times. Now, watch this 2016 video of two cats fighting over milk, a condensed version of which has been uploaded and passed around via Giphy.
Isn’t it interesting that in the social video of modern times, the old is new!
With this, I am silently signing off from Friday Faves. I hope that you were somehow informed, entertained or provoked in thought by my weekly musings on social media, technology and the often fascinating, sometimes intriguing, but always wonderful world wide web underpinning our digital lives.
Read: https://www.nytimes.com/2017/09/13/movies/silent-film-youtube-videos.html