Academy Xi Blog

Combatting Digital Imposter Syndrome

By Academy Xi


“Who can I ask about this system?”

“Everyone seems to know how to use this program except me.”

“I am going to need help – but am embarrassed to ask.”

Sound familiar? The struggle of Digital Imposter Syndrome is real. We’ve all felt it at some point in our personal or professional lives.

Researchers say that up to 70% of people suffer from some form of Imposter Syndrome at one point or another. The fact that none of us are immune to it doesn’t make it any less destructive to our confidence.

The Digital version relates to the way people feel about themselves when interacting with programs, platforms, systems, data, emerging tech and so on. One study conducted by Chapman University found that tech-related fears were the second most common fear category amongst adults.

How often have you encountered a new piece of technology in the workplace and held back from asking the questions you needed to truly understand how it works? Without that knowledge you can’t leverage its potential, or your own. Multiply that productivity lag across a whole team and you can see how a seemingly innocuous mindset collectively holds up business growth.

So what can businesses do?

Start by listening.

The starting point for addressing fear is to understand it. Listen to your people. What do they say are the major barriers? What are the big pain points? How exactly are they struggling to interact with specific tech? Where do they feel they fit into the broader digital ecosystem? To quote Zendesk, “Digital transformation can be a rough employee experience” – but it doesn’t have to be when you know which issues are of greatest concern.

Take your people on the journey. 

In one of our recent blogs, we discuss that when rolling out digital transformation projects, it is critical to bring your people along for the ride. It all starts with demystification. Bring the technology that underpins your organisation out into the light. To capture the full potential of any single piece of digital technology, your people need to:

  • a) Know how to use it
  • b) Know who to turn to when they have questions, and
  • c) Understand where it fits into the broader organisational digital ecosystem

Mx Taîss Quartápa, a senior manager at Accenture, says ‘When I was managing the graduates at a previous workplace, I saw various versions of this [digital imposter syndrome]. The reactions I got really depended on the individual but would range from “No, I can’t do that, I haven’t been trained on that” to “I have only learnt wireframing on x software, not on y software” to “I’m just faking it – I google things and copy what to do – I’m sure that won’t always work”.’

Pictured: Taîss Quartápa

Provide the right training at the right time.

All organisational digital transformation is progressive. As a company, you need to crawl before you walk and walk before you run (digitally speaking). The same concept trickles down into the way you support your people. It is about giving everyone the right education at the right time. There is no use rolling out a training program for a new system if your people don’t understand where it fits, how it impacts them or why the company has chosen to go in this direction in the first place.

Quartápa says that it is a matter of reframing. ‘If we reframe imposter syndrome to imposter experience, perhaps we would evaluate it with a more objective lens.’ They believe self-doubt is normal, especially when venturing into unknown and challenging territory – something they believe we continue to do daily as part of the very definition of our roles.

Alleviate anxiety through cultural change. 

Businesses need their people to be courageous. The message needs to be: “Let’s prioritise taking the action we need to achieve our goals over looking foolish and feeling fearful”. This is obviously a mindset shift in a working environment where ‘not knowing’ is often seen as weakness. 

Within their teams, Quartápa says, “a willingness to ask for help is a key attribute of their success. I expect them to be curious and to keep trying new things, and I know that can be really hard, especially when you’re new and trying to demonstrate competence early. I know everyone says ‘there are no stupid questions’, but there actually is such a thing as a so-called stupid question. It’s the question you could have found the answer to on your own. So, read books, ask peers or google it – there’s no wrong way to admit that you do not know everything and that you are willing to learn more.”

Our opinion? We believe that the best possible place to start with this change is at the top. Consider how powerful it would be for your Executive Team to declare “We aren’t ‘digital natives’ and could use some help to adapt to these new ways of working.” Talk about courage.

If ever there was a time to support people struggling with Digital Imposter Syndrome, it is in the wake of COVID-19. We all now have a much greater reliance on technology. At a time when we are expected to use digital skills intensely across many parts of our jobs, organisations need to be aware of the additional strain this can place on their people.

Want to give your teams a boost in digital confidence?

We have training solutions for every stage of the digital journey. From Intro Courses (1 day) to larger digital transformation programs, we can help. Discuss your digital training needs with us. 

Academy Xi Blog

The Future of BodyTech: Will We Become Cyborgs?

By Academy Xi


If you haven’t been living under a rock, it’s likely that you’ve heard the news — That Startup Show is back for 2018. As Australia’s number one show about startup culture, each episode focuses on emerging technologies such as SpaceTech, BodyTech, robotics, and the Internet of Things (IoT).

In the latest episode, ‘BodyTech — Turning Brilliant Bio Ideas into Booming Business’, Academy Xi’s very own Frank Guzman volunteered his body in the name of science — by inserting an NFC chip into his left hand. A Near-Field Communication (NFC) chip can be programmed to perform a variety of simple tasks, including playing a song or even opening a door.
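For a sense of how little ‘programming’ such a chip actually involves, here is a minimal Python sketch (ours, not the show’s) that builds the standard NDEF URI record most NFC-enabled phones recognise when they tap a tag. The URL, function name and prefix table are purely illustrative, and writing the resulting bytes to a physical chip would still require a reader library and compatible hardware.

    # Minimal sketch: encode a single short NDEF URI record (TNF 0x01, type 'U').
    # All names and the example URL are illustrative only.

    URI_PREFIXES = {
        "https://www.": 0x02,  # longer prefixes listed first so they match before shorter ones
        "http://www.": 0x01,
        "https://": 0x04,
        "http://": 0x03,
    }

    def encode_uri_record(url: str) -> bytes:
        """Pack a URL into the NDEF bytes a phone would parse as a tappable link."""
        prefix_code, rest = 0x00, url
        for prefix, code in URI_PREFIXES.items():
            if url.startswith(prefix):
                prefix_code, rest = code, url[len(prefix):]
                break
        payload = bytes([prefix_code]) + rest.encode("utf-8")
        header = 0xD1  # MB | ME | SR | TNF=0x01 (well-known type): one short record
        return bytes([header, 0x01, len(payload)]) + b"U" + payload

    if __name__ == "__main__":
        record = encode_uri_record("https://example.com/never-gonna-give-you-up")  # placeholder URL
        print(record.hex())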

Revolutionising the relationship between technology and our bodies is big business — with the health and wellness industry predicted to be worth almost US $4 trillion. Global entrepreneurs continue to look for ways to innovate in health technologies such as CRISPR, a gene-editing technique that provides new possibilities for the diagnosis and treatment of genetic diseases.

When developing BodyTech products, companies employ User Experience (UX) Design tools to understand a user’s interests, behaviours, and motivations. For example, both soft and hard exoskeleton technologies are currently being developed to help paraplegics and others with mobility issues walk again. Similarly, by understanding a user’s pain points and needs, exoskeletons could be designed to reduce the load of manually intensive labour, such as work on a construction site.

At Academy Xi, we strive to transform the world through education — through our short courses in emerging technology and design such as User Experience (UX) Design, Service Design, or Mixed Reality Design. We believe that in a world of constant disruption and exponential technologies, people can enhance their skills and employability by investing in high-quality digital skills of the future.

As a proud partner of That Startup Show, Academy Xi is currently giving away two free tickets to each live filming of That Startup Show. Simply Tweet us and tell us why you’d like to attend the show. Be on the lookout — we’ll contact ticket winners each week.

So what did Frank get programmed onto his NFC chip? The ability to play Rick Astley’s ‘Never Gonna Give You Up’ on command. #rickrolling

You can watch the latest BodyTech episode of That Startup Show here.

Academy Xi Blog

Self Identity in the Post-Work Era

By Charbel Zeaiter


On the back of my presentation at Future Assembly, we’re contemplating more and more the potential effects of AI, as well as the massive, inevitable changes that will come from rethinking who we are when work may no longer be the predominant way in which we identify ourselves.

The idea that one day we would have smart machines has been floated for centuries. We have grown up hearing Sci-Fi stories that are optimistic about our artificial future, as well as those imagining the world as a futuristic wasteland where humans are no longer required. Technology has finally reached a point where ideas from these stories are coming to life, and it’s now becoming important for us to understand the flow-on effects these changes are going to have on life as we know it.

Robots are becoming increasingly present in the workplace. Baxter, the brainchild of Rodney Brooks of Rethink Robotics (formerly of MIT), is marketed as being able to complete “the monotonous tasks that free up your skilled human labor”. Just this week Hitachi announced that they are introducing intelligent robots to supervise and manage their employees. The robots ‘hired’ by Hitachi may be able to manage employees on the floor and boost productivity, but at what cost?

Management, at its core, is a human activity that focuses on motivating, leading, building relationships and showing empathy. What will the fallout be for human workers without this human element, and how long will it be before the robot supervisor is supervising a completely automated workforce?

Sure, we can design intelligence, but what about emotional intelligence? Aldebaran, a Japanese-owned robotics company, thinks it can. It has created Pepper, a social robot designed to live with humans. Pepper is described as “a companion able to communicate with you through the most intuitive interface we know: voice, touch and emotions.” Pepper can gauge someone’s emotional state by analysing their facial expressions, body language and word choices, but is that really authentic emotional intelligence? How much deeper does it go?

Recognising someone’s mood is one thing, but are we ever going to be able to design empathy? Are we ever going to be able to design a program so complex that it can genuinely understand its human counterpart and help them navigate their emotions? And really, is designing emotional intelligence something we should do? Do we really need robots that are emotionally intelligent?

There is another question we have to ask ourselves when we start imagining this modern, completely autonomous workforce. When the world’s production capability reaches 100% continual production, what is it producing? Who is it for? And honestly, how much stuff do we actually need? With our current production output doing irreversible and downright devastating damage to the planet, it raises the question: why are we really pursuing this kind of technology? Will our changing attitudes towards sustainability and protecting the planet be in alignment with the attitudes of organisations whose production lines will no longer be limited by the output capacity of humans?

“Ultimately though, the biggest change we are facing now is how we will define ourselves when work ceases to be the centrepiece of how we introduce ourselves.”

There are some pretty confronting statistics flying around about the number of jobs the world looks set to lose over the next decade. Yet with all this uncertainty and doom and gloom, there are also some pretty startling opportunities. People now have an exciting reason to reinvent themselves and to learn new things. Is what we do really what defines us? If and when machines begin to dominate the workforce, will people be free to start exploring who they are and what makes them truly happy?

Academy Xi Blog

Design Ethics for Artificial Intelligence

By Charbel Zeaiter


After a great recent weekend at Future Assembly in Melbourne, I was compelled to repeat my talk, “Design Ethics for Artificial Intelligence” (slightly abridged).

Artificial Intelligence has been edging its way into our reality for a while now and it’s a topic that’s been discussed for decades. The fear of humans becoming slaves to AI is an interesting fear; some observers would say we’re already slaves to our devices and gadgets, therefore slaves to intelligence outside of ourselves.

The purpose of my talk was not to paint the expected doomsday view of AI and its possible effects on humanity, but to open up discussion about the complexity of embedding value systems into decision making.

To take action, we need to assess a situation and make a judgement call. Where do these judgement calls come from? They come from our value systems, and value systems are complicated.

Using Isaac Asimov’s Three Laws of Robotics, I posed a single scenario (with some variants each time) and asked the audience to make a judgement call based on different value systems:

  1. Emotional
  2. Economic
  3. Probability
  4. Religious
  5. Environmental

With a central character, Caitlin, our robotic butler, we posed these scenarios and presented a choice that she had to make based on the above value systems.
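As a toy illustration of the exercise (not material from the talk), the Python sketch below scores the same two options for a hypothetical robotic butler under a few of the value systems listed above. Every option, field and number is invented; the point is simply that the ‘right’ choice flips depending on which value system you privilege.

    # Toy sketch: the same decision judged under different value systems.
    # All options and scores are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Option:
        name: str
        emotional: float      # comfort or distress caused
        economic: float       # financial cost or benefit
        probability: float    # likelihood the action achieves its goal
        environmental: float  # impact on the surroundings

    VALUE_SYSTEMS = {
        "emotional":     lambda o: o.emotional,
        "economic":      lambda o: o.economic,
        "probability":   lambda o: o.probability,
        "environmental": lambda o: o.environmental,
    }

    def judge(options, value_system):
        """Return the option that scores highest under the chosen value system."""
        return max(options, key=VALUE_SYSTEMS[value_system])

    options = [
        Option("wake the owner", emotional=-0.4, economic=0.8,  probability=0.9, environmental=-0.1),
        Option("let them sleep", emotional=0.7,  economic=-0.2, probability=1.0, environmental=0.0),
    ]

    for system in VALUE_SYSTEMS:
        print(f"{system}: {judge(options, system).name}")

Even in this trivial form, every score is an arbitrary human judgement, which is exactly where the complexity lies.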

A great Q&A followed, exploring the frustrating, flawed, emotional and highly subjective complexity of the human condition – aka our value systems.

When designing Artificial Intelligence, what are we really designing? Furthermore, what happens when Artificial Intelligence is no longer Artificial and can ponder its very existence?

Download the presentation and discuss it at work and at home. I’m not at all worried about AI if it’s left alone; I’m concerned when humans, who are fundamentally flawed, design decision-making into immature intelligence.