Friday Faves is our weekly blog series highlighting a few select pieces from the REG team’s reading lists. You can catch up on past Friday Faves on the archive.
Digital Transformation – How Do You Define It?
Anne says: Workplace terminology shifts to reflect technology advancements. The latest term used to describe how an organisation embraces technology is digital transformation. But just pause for a moment – what do we mean by transformation, and in fact what are we calling digital? According to the Chambers dictionary, a transformation is a marked change in form, nature or appearance. Is this what we imply with digital transformation?
PwC has released its 2017 Global Digital IQ Survey results, with an overview in Harvard Business Review. The survey, covering 2,000+ executives at companies with annual revenue of more than US$500 million, investigates their evolving priorities, their challenges and how they're using technology.
The comparisons with 2007, only 10 years ago, are profound. Here are a few highlights:
- In 2007 companies focused on data mining, search and virtual collaboration. In 2017 they’re looking at artificial intelligence, machine learning, and the Internet of Things.
- In 2007 there was no mobile strategy.
- The meaning of digital has changed. In 2007, it referred to “IT”. Today, it refers to the roadmap and goals across departments, from marketing to HR, and extends to customers.
- In 2007 only 40% of CIOs were involved in strategic planning. Now they are considered one of the most important members of the C-Suite.
However, average investment in emerging technologies and innovation is down. The spend is the lowest of the past decade, with technology initiatives now centred on increasing revenue and reducing costs.
Herein lies the contradiction: corporate rhetoric of digital transformation versus reduced investment and commitment to digital initiatives.
The key finding aligns with our approach over the past decade: focus on the experience of people if you want a digital initiative to be successful!
Read: https://www.pwc.com/us/en/advisory-services/digital-iq.html
Chatbots Want to Talk About Your Mental Health
Jakkii says: Woebot was released this week through Facebook Messenger. A chatbot whose design is grounded in science (led by Dr Alison Darcy of Stanford), Woebot employs cognitive behavioural therapy (CBT) techniques to help users manage their mental health. The chatbot does not use machine learning; rather, it uses extensive decision trees to select responses to user input.
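To make the distinction concrete, here is a minimal sketch of how a decision-tree chatbot can work. This is an illustrative toy, not Woebot's actual implementation: every node, prompt and branch below is invented for the example. Each node holds a prompt and a mapping from recognised answers to follow-up nodes, so every reply is scripted in advance rather than generated by a learned model.

```python
# Toy decision-tree chatbot: nested dict of scripted prompts and branches.
# Unrecognised input falls back to re-asking the current prompt.

CHECK_IN_TREE = {
    "prompt": "How are you feeling today?",
    "branches": {
        "good": {
            "prompt": "Great to hear! What's one thing that went well?",
            "branches": {},
        },
        "bad": {
            "prompt": "Sorry to hear that. Is it more worry or low mood?",
            "branches": {
                "worry": {
                    "prompt": "Let's try noting the thought behind the worry.",
                    "branches": {},
                },
                "low mood": {
                    "prompt": "Could you name one small activity to try today?",
                    "branches": {},
                },
            },
        },
    },
}

def respond(node, user_input):
    """Return (reply, next_node); clarify and stay put on unknown input."""
    next_node = node["branches"].get(user_input.strip().lower())
    if next_node is None:
        return "I didn't catch that. " + node["prompt"], node
    return next_node["prompt"], next_node
```

The appeal of this design for mental health is predictability: every path through the tree can be reviewed by clinicians in advance, which a generative model cannot guarantee.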
Woebot isn’t the first AI foray into the mental health arena. X2AI deployed a chatbot called Karim to help Syrian refugees, an Arabic-speaking bot itself based on X2AI’s flagship healthcare AI named Tess. Then there’s Nadia, developed in New Zealand by Mark Sagar at the University of Auckland – and voiced by Cate Blanchett – which helps users access the Australian National Disability Insurance Scheme (NDIS).
These uses of AI are particularly fascinating, as they seek to improve human lives rather than simply sell them something, or make them “more productive.” Healthcare is a field rife with opportunities for both technological advances and increasingly widespread use of AI and machine learning, a trend we see played out in the media on a regular basis with stories on the latest advancements. Closing the gap on affordable access to mental healthcare is of significant importance, and one in which chatbots may play a critical role.
It’s not without risks, however. For instance, will a chatbot challenge a patient when needed in the way a human therapist would? What happens when the chatbot detects a need to provide text & phone helpline information and the user can’t afford to seek additional care? Where does ‘this isn’t a doctor and can’t provide you advice’ end and liability begin?
And then, of course, there’s privacy. Woebot is deployed solely on Facebook (though the company hopes to raise funds to build a fully standalone app in future). Any disclosures users make to Woebot aren’t actually anonymous – Facebook knows exactly who they are, and owns and retains that data. It’s hard enough to make people aware of online privacy and data collection, and of their possible implications. Are vulnerable people who need access to ‘someone’ to listen to them, to help them, likely to carefully consider their privacy when interacting with a chatbot on Facebook, if it offers exactly what they’re looking for at an affordable price while presenting an illusion of safety?
These concerns are valid, and must be considered by designers and users alike. Yet there is much potential. Though chatbots such as Woebot remain unproven, if the bets these companies are making on AI in mental healthcare prove accurate, they may just do a great many people a world of good.
Read: http://mashable.com/2017/06/08/mental-health-chatbots
The Secret Social Media Lives of Teenagers
“(We) need to shift the conversation around teens’ social media use away from a fear of getting caught and more toward healthy socialization, effective self-regulation and overall safety.”
Emilio says: Oh the fascinating workings of social media and the even more fascinating behaviour of young people using it to earn their friends’ approval.
The dangers of reckless online behaviour were once again brought to the fore when a handful of students recently had the Harvard degrees of their dreams shattered – all because of distasteful posts they had made to a private Facebook group. Closer to home, a man in Australia recently and shamelessly posted a live video of himself performing a sex act with a partner to a private all-male Facebook audience, berating her and mocking her body shape in his caption.
The propensity amongst young people to muck around and make fools of themselves and of others on social media is nothing new. What’s new and quite revealing to me is the biological connection this piece establishes between young people and reckless social media behaviour. According to the study, the part of teens’ brains responsible for logic and reason is still largely undeveloped, which can lead to impulsive behaviour; combined with the need for validation, it is a recipe for damaging online footprints – ones that could come back to bite them for life.
The golden rule for people on social media, young and old, continues to be: be careful, be thoughtful, and conduct yourself as you would when interacting with other people in real life. Because no matter how private and safe we think the things we share with close friends are, nothing is ever truly private on the internet.
Read: https://www.nytimes.com/2017/06/07/well/family/the-secret-social-media-lives-of-teenagers.html
The Existentialism of GPS
“Geographic existentialism, wherein even our instruments are lost, is just one intriguing side effect of relying on contemporary satellite technology for a sense of real-time location.”
Nat says: I found this article a fascinating read in terms of technology, ‘truth’ and existentialism. It seems that humankind’s quest to answer ‘what is our place in the Universe?’ is embodied by the instruments we build in an attempt to answer it. Ironically, such instrumentation is itself lost in its pursuit of pin-pointing a concrete answer (in this case, an exact location) for us.
Although the article points to errors in GPS data capture, a much broader question is raised as a result. Do we, without question, accept the “data” that technology provides us? I would argue yes. Technology is something we so readily take for granted in our daily lives that we often forget it is designed and programmed by people – and people are fumbling, emotional and irrational beings who make errors. Such ‘errors’ go into technology. We often place so much blind faith and trust in the elusive, all-knowing ‘technology’ that we fail to question its origins and, subsequently, its accuracy.
This is essentially what the article alludes to – reminding us that critical thinking and technology need to go hand-in-hand. How many stories of blind faith do we need to hear (i.e. someone driving into a lake because their GPS told them to) before we start thinking about the uses and limitations of tech? Especially when the example given in the article is a GPS satellite recording data that suggested a static building had moved!
Read: https://www.theatlantic.com/technology/archive/2016/06/gps-goes-adrift/487334/
Foxtel’s Streaming Service Rebranded, Significantly Cheaper
Joel says: Foxtel has been delivering pay TV content into our homes for over 20 years now. But with the recent arrival of streaming services Netflix and Stan, it has been forced to shift its approach, branding and distribution model.
Foxtel Play has now been rebranded as Foxtel Now, reinforcing the message of ‘on demand’ to compete better with the other big streaming platforms and increase the number of Australian homes that subscribe to a Foxtel product (currently sitting at around 30%).
At launch, Foxtel claims there will be 16,000 titles available on demand (depending on your subscription package), half of them in high definition (720p). You’ll be able to stream Foxtel channels for as little as $10 a month, with packs ranging from $10 to $29 per month; Australia’s most downloaded show in 2016, Game of Thrones, is available in the $15 per month Drama pack.
The new Foxtel Now app launched on Wednesday and is available now for iOS and Android devices as well as PC, Mac and Telstra TV.
Read: http://press-start.com.au/news/2017/06/06/foxtel-play-rebranded-made-significantly-cheaper/