Academy Xi Blog

Data analysis methods and techniques

By Academy Xi



Are you new to data analysis and keen to find out what makes the industry tick? Read on to discover the methods and techniques that are the driving force behind one of the world’s fastest growing professions.

What is data analysis?

Data analysis is the process of organising, cleaning and examining raw data in order to draw conclusions and create meaningful solutions. In short, it’s about making sense of data to make well-informed, real-life decisions.

Data Analysts sit at the intersection of business, technology and statistics. With the primary goal of increasing efficiency and improving performance, Data Analysts discover patterns in data and make strategic recommendations.

Why is data analysis important?

Many businesses have a wealth of raw data at their fingertips, but data that sits untouched in a spreadsheet is of minimal value and translating it into actionable insights is easier said than done.

Data analytics helps a business tap into a vital resource and better understand its products, customers and competitors, as well as its own operational procedures and capabilities.

Armed with the knowledge data analysis brings about, businesses are able to identify inefficiencies and opportunities. Because data is not an opinion or a theory, it can act as an impartial source of truth when making important business decisions.

What is customer analytics?

Customer analytics is the process of collecting and analysing customer data to gain insights into customer behaviour. Customer analytics helps businesses make smarter decisions that build stronger connections with customers.

Companies use customer analytics to develop customer-centric business strategies related to marketing, product development, sales, and more. This could include planning your entire customer journey, or building personalised marketing campaigns.

What is the data analysis process?


The data analysis process can be broken down into a few simple steps:

✔️ Step 1: Identify – The first step is to define the problem you’re trying to solve and figure out what kind of data is likely to help you find that solution. 

✔️ Step 2: Collect – Now that you know what data you’re after, your next job is to collect it. Data might be extracted internally, or gathered from external sources. 

✔️ Step 3: Clean – Before it can be analysed, data needs to be cleaned. This involves removing any erroneous information that will distort your results, also known as ‘dirty data’. 

✔️ Step 4: Analyse – With your data cleaned and prepped, it’s time to get down to the all-important analysis. There are a range of tried-and-trusted analytical techniques and approaches, some of which we’ll unpack in a little while. 

✔️ Step 5: Interpret – Now you’ve analysed your data, you should be able to interpret the findings in relation to your original problem. This will allow you to make a data-backed recommendation to the relevant stakeholders. 
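As a rough illustration, the five steps above can be sketched in a few lines of Python. The sales figures and the "weakest store" question are invented purely for the example:

```python
from statistics import mean

# Step 1: Identify - question: "Which store had the weakest average daily sales?"
# Step 2: Collect - in practice this might come from a database or a CSV export.
raw = [
    {"store": "North", "sales": "120"},
    {"store": "South", "sales": "95"},
    {"store": "North", "sales": "abc"},   # a 'dirty data' record
    {"store": "South", "sales": "110"},
    {"store": "North", "sales": "130"},
]

# Step 3: Clean - drop rows whose sales figure isn't a valid number.
clean = [r for r in raw if r["sales"].isdigit()]

# Step 4: Analyse - average sales per store.
stores = {s: mean(int(r["sales"]) for r in clean if r["store"] == s)
          for s in {r["store"] for r in clean}}

# Step 5: Interpret - report the weakest store to stakeholders.
weakest = min(stores, key=stores.get)
print(weakest, stores[weakest])
```

Real pipelines are far more involved, but the shape of the process is the same: define the question first, and only analyse data that has been cleaned.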

Essential types of data analysis

Here’s a list of definitions for the most important types of data analysis:  

  • Exploratory – ‘How should I use the data?’

Exploratory data analysis (EDA) techniques are used by Data Analysts to investigate data sets and summarise their main characteristics, often employing data visualisation methods. 

This helps determine how best to manipulate data sources to get the answers you need, making it easier for Data Analysts to discover patterns, spot anomalies, test a hypothesis, or check any underlying assumptions.
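A minimal EDA sketch, using made-up daily order counts, might summarise the data and flag anomalies before any deeper analysis begins:

```python
from statistics import mean, median, stdev

# Hypothetical daily order counts for an online shop.
orders = [102, 98, 110, 105, 3, 99, 107, 101]

mu, sigma = mean(orders), stdev(orders)
print(f"mean={mu:.1f}, median={median(orders)}, stdev={sigma:.1f}")

# Flag values more than 2 standard deviations from the mean as anomalies.
anomalies = [x for x in orders if abs(x - mu) > 2 * sigma]
print("anomalies:", anomalies)
```

In practice this stage would also include histograms and scatter plots (for example with Matplotlib), since visualisation is often the fastest way to spot a pattern.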

  • Descriptive – ‘What happened?’

Descriptive analytics is a simple, surface-level form of data analysis that clarifies what has happened in the past. This involves using data aggregation and data mining techniques. 

For instance, a company that monitors its website traffic might mine that data and find a day when the number of visitors dipped dramatically.

  • Diagnostic – ‘Why did it happen?’

Once an anomaly has been identified, a Data Analyst will then look at additional data sources which might tell them why this occurred. The analyst is searching for causal relationships within the data, which could mean using probability theory, regression analysis, filtering, or time series analytics. 

Following our example, the Data Analyst might consult data about the company’s day-by-day advertising spend and discover that certain advertising channels were switched off on the day the website traffic decreased. 
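To make the running example concrete, here is a sketch with invented daily figures: descriptive analysis finds the dip, and diagnostic analysis checks a second data source for a likely cause:

```python
# Invented daily figures: website visitors and advertising spend ($).
visitors = {"Mon": 5200, "Tue": 5100, "Wed": 1800, "Thu": 5300, "Fri": 5250}
ad_spend = {"Mon": 400, "Tue": 410, "Wed": 0, "Thu": 395, "Fri": 405}

# Descriptive: what happened? Find the day traffic dipped.
dip_day = min(visitors, key=visitors.get)
print("Traffic dipped on", dip_day)

# Diagnostic: why did it happen? Consult the other data source for that day.
if ad_spend[dip_day] == 0:
    print("Advertising was switched off on", dip_day)
```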

  • Predictive – ‘What is likely to happen?’

This is when Data Analysts start to come up with data-driven insights that a company can act on. Predictive analytics estimates the likelihood of a future outcome based on historical data and probability theory. 

While predictive analytics can never be completely accurate, it takes much of the guesswork out of making crucial business decisions.

Using the example above, the Data Analyst could make a reasonable prediction that temporary reductions in advertising spend are likely to yield a short-term drop in website traffic.

  • Prescriptive – ‘What’s the best course of action?’

Prescriptive analytics advises a business on which course of action to take and aims to take advantage of any predicted outcomes. 

When conducting prescriptive analysis, Data Analysts will consider a range of possible scenarios and assess the consequences of different decisions and actions. As one of the more complex forms of analysis, this may involve working with algorithms and machine learning. 

Using our example, the Data Analyst might recommend that the business maintains a more even day-by-day advertising spend in order to generate consistent levels of website traffic.   

  • Inferential – ‘What are the larger implications?’

When conducting people-focused data analysis, you can normally only acquire data from a sample group, because it’s too difficult or expensive to collect data from the whole population you’re interested in.

While descriptive statistics can only summarise a sample’s characteristics, inferential statistics use your sample to make reasonable guesses about the larger population. Though data might have been collected from a hundred people, you could use inferential statistics to make predictions about millions of people.  

With inferential statistics, it’s important to use random and unbiased sampling methods. If your sample isn’t representative of your population, then you can’t draw large-scale conclusions.
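As a sketch of the idea, with simulated scores standing in for real survey data, a confidence interval uses a sample to bound a value for the whole population:

```python
import random
from math import sqrt
from statistics import mean, stdev

# Hypothetical sample: satisfaction scores (out of 10) from 100 customers.
random.seed(42)
sample = [random.gauss(7.2, 1.5) for _ in range(100)]

# Normal-approximation 95% confidence interval for the population mean:
# sample mean +/- 1.96 standard errors.
m = mean(sample)
se = stdev(sample) / sqrt(len(sample))
low, high = m - 1.96 * se, m + 1.96 * se
print(f"The population mean plausibly lies between {low:.2f} and {high:.2f}")
```

The interval is only trustworthy if the sample really was drawn randomly, which is exactly the point made above.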

Types of data analysis methods

Here’s a simple breakdown of the most popular and useful methods that modern Data Analysts rely on:  

Cluster analysis

This is an analysis method whereby a set of objects or data points with similar characteristics are grouped together in clusters. The aim of cluster analysis is to organise observed data into meaningful structures in order to make it easier to gain further insight from them.
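A toy version of the idea, grouping hypothetical customer ages into two clusters with a hand-rolled k-means loop (real work would typically use a library such as scikit-learn):

```python
from statistics import mean

# Hypothetical customer ages, to be grouped into two clusters (k = 2).
ages = [19, 21, 22, 24, 45, 47, 50, 52]
centres = [ages[0], ages[-1]]   # naive starting centres

for _ in range(10):   # a few refinement passes
    clusters = [[], []]
    # Assign each point to its nearest centre...
    for a in ages:
        nearest = min((0, 1), key=lambda i: abs(a - centres[i]))
        clusters[nearest].append(a)
    # ...then move each centre to the mean of its cluster.
    centres = [mean(c) for c in clusters]

print(clusters)   # two groups of similar ages
```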

Cohort analysis

This is a form of behavioural analytics that breaks data (normally attached to people) into related groups before analysis. These groups, or cohorts, will share common characteristics and experiences. 

Cohort analysis allows a business to see patterns within the life-cycle of a customer, rather than analysing all customers without accounting for their position in a customer journey.
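A minimal sketch, using invented customer records: group customers into cohorts by signup month, then compare retention across cohorts rather than across all customers at once:

```python
from collections import defaultdict

# Hypothetical records: (customer, signup_month, still_active)
customers = [
    ("a", "2024-01", True), ("b", "2024-01", False),
    ("c", "2024-02", True), ("d", "2024-02", True),
    ("e", "2024-02", False), ("f", "2024-03", True),
]

# Group customers into cohorts by signup month...
cohorts = defaultdict(list)
for _, month, active in customers:
    cohorts[month].append(active)

# ...then compare the share still active within each cohort.
retention = {m: sum(a) / len(a) for m, a in cohorts.items()}
print(retention)
```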

Regression analysis

This method enables analysts to accurately identify which variables impact a data set of interest. It helps analysts confidently determine which factors matter most and which can be ignored, as well as how certain factors influence each other.

To give a very simple example, a person’s weight tends to increase as their height increases. Analysts look for these kinds of relationships between variables because they clarify how one factor is associated with another, making outcomes more predictable (bearing in mind that a correlation alone doesn’t prove that one factor causes the other).

Neural networks

Used in machine and deep learning, neural networks are a series of algorithms loosely modelled on the neurons of the human brain. Each of these artificial neurons:

  • Receives data from the input layer
  • Processes it by performing calculations 
  • Transmits the processed data to another neuron

How data moves between neurons within a network, and which calculations are performed, depends on what is uncovered along the way. Though a neural network makes its own decisions about what to do with data, it first needs to be trained with data inputs.
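A single artificial neuron can be sketched in a few lines; the weights and bias below are arbitrary example values (in a trained network they would be learned from data):

```python
import math

# One artificial neuron: a weighted sum of its inputs, plus a bias,
# passed through an activation function (sigmoid here).
def neuron(inputs, weights, bias):
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))   # squashes the output into (0, 1)

# Receives data, performs the calculation, and the result would be
# transmitted onward to neurons in the next layer.
out = neuron([0.5, 0.8], weights=[0.4, -0.2], bias=0.1)
print(out)
```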

IBM Watson is powered by 2,800 processor cores and 15 terabytes of memory. Its neural network was trained over a period of two years, as millions of pages of patient records, medical journals and other documents were uploaded to the system. IBM Watson has been used to help diagnose rare illnesses and diseases.

Time series analysis

This method involves analysing a sequence of data points collected over an interval of time. In time series analysis, analysts record data points at consistent intervals over a set period of time, as opposed to recording data points intermittently or randomly. 

The advantage this offers over other methods is that the analysis can show how variables change over time. In other words, time is a crucial variable because it shows how the data evolves over the course of its life cycle, rather than just focusing on the end results. 
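One of the simplest time series techniques is a moving average, which smooths short-term noise so the trend over time is easier to see. The monthly sales figures here are invented:

```python
# Hypothetical monthly sales, recorded at consistent intervals.
sales = [100, 120, 90, 130, 150, 140, 160, 170]

# A 3-point moving average: each value is replaced by the mean of
# itself and its neighbours, smoothing out month-to-month noise.
window = 3
smoothed = [sum(sales[i:i + window]) / window
            for i in range(len(sales) - window + 1)]
print(smoothed)
```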

How can I become a Data Analyst?

Want to learn more about all things Data Analytics?

Academy Xi has a proven track record of helping people boost their skillset and revamp their role with Data Analytics, or even launch a completely new career as a Data Analyst.

We offer a range of courses built and taught by industry experts that are specially designed to suit your ambitions and lifestyle:

  • Harness the power of data in your career even as a non-data professional by upskilling with our Data Analytics: Elevate course.
  • Change careers with our Data Analytics: Transform course and get access to a Career Support Program that helps 97% of graduates straight into the industry.

If you have any questions or want to discuss your career prospects, speak to a course advisor and take the first steps in your Data Analytics journey.


Academy Xi Blog

Data modelling techniques, concepts and tools

By Academy Xi


Seeking consistent, reliable results in business? Discover how Data Modelling can produce high quality, structured data to enable just that.


What is data modelling?

In the world of Software Engineering, data modelling is the process of creating a simplified representation of a software system and the data it contains. This data model is used as a basis, or blueprint, for creating a new and improved version of the system.

In data modelling, data is expressed as a series of symbols, diagrams and text to provide a visual representation of how the elements of data interrelate. 

Why is data modelling important?

A database isn’t created in a vacuum, never to be improved, upgraded or evaluated. To organise data and make it available as required, databases need to keep evolving. Data modelling enables consistent improvement across security, naming conventions and rules, and ultimately supports better data analytics.

Data modelling also reduces errors, which improves both the performance and the overall quality of the data.

Routinely applying data modelling also makes it easier for companies to comply with the national and global laws and regulations relevant to their industry.

If you want your teams to make data-driven decisions, then making data modelling a standard part of your organisation’s IT approach will ensure they are accessing the best quality data possible. 

Best data modelling techniques and concepts

There are several techniques used to organise data, as follows:

  • Hierarchical data model technique

As the name would suggest, this technique has a hierarchy to it, with a tree-like structure. Data is gathered to one root, which branches off containing other connected data, extending the tree.

You may see this technique applied in HR terms for a company structure, if it has a hierarchical approach, with a number of employees reporting to one department.
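The tree structure maps naturally onto nested data: a sketch of that HR example, with invented names, might look like this:

```python
# A hierarchical model as a tree: one root, with branches of connected data.
company = {
    "CEO": {
        "Engineering": {"staff": ["Ada", "Grace"]},
        "Marketing": {"staff": ["Don"]},
    }
}

# Every record is reached by walking down from the single root.
print(company["CEO"]["Engineering"]["staff"])
```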

  • Object-oriented data model technique

A collection of objects or components is referred to as an object-oriented data model. There are three types of models in this type of data design: Class, state and interaction.

  • Relational data model technique

Data arranged into columns and rows within tables makes data easy to identify, as it is clearly ordered. This is one of the reasons it is a very popular model to implement. This technique is used to describe relationships between entities.
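A small sketch of the relational model using Python's built-in SQLite support: two tables, with the relationship between the entities expressed through a shared id (the customer and order data are invented):

```python
import sqlite3

# Rows and columns in tables; each order references a customer by id.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, "
           "customer_id INTEGER, total REAL)")
db.execute("INSERT INTO customers VALUES (1, 'Ada')")
db.execute("INSERT INTO orders VALUES (10, 1, 99.5)")

# The relationship between the two entities is expressed with a JOIN.
row = db.execute("""SELECT c.name, o.total FROM orders o
                    JOIN customers c ON c.id = o.customer_id""").fetchone()
print(row)
```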

  • Entity relationship model

Sometimes referred to as an ER model, this approach models real-life elements and the connections between them. This model groups data into general attributes, entity sets, relationship sets and constraints.

  • Logical data model

To devise a technical map of rules and structures for data, which can then be applied to specific project needs, a logical data model would be the answer. It’s a more honed understanding of data entities and the interconnectedness between them.

  • Graph data model

Where the relational model focuses on information being arranged within a list format, this model spotlights the relationships occurring between information.
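A graph model can be sketched as nodes plus their relationships; here is a tiny, invented "who follows whom" example where the relationships are read directly off the structure:

```python
# A graph model: nodes, and the relationships between them.
follows = {
    "ana": ["ben", "cai"],
    "ben": ["cai"],
    "cai": [],
}

# Relationship queries read straight off the structure.
print("ben" in follows["ana"])
```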

  • Conceptual data model

Simple and abstract, the conceptual data model is very popular as it can communicate ideas with ease, which is important when presenting to a range of people, particularly if you are seeking buy-in with an idea. This model provides a structured business view of the data needed to ensure the business processes are optimal.

The best data modelling tools and uses


  • Microsoft Excel

Deemed one of the most universal data modelling tools, Excel offers formulas to support your data gathering and calculations. You can build a relational data source inside an Excel workbook, with tabular data used in PivotTables and PivotCharts.

  • Python

The Python programming language can create and manage data structures rapidly, and provides a range of tools for data analysis and manipulation, so data structures and datasets can be represented quickly.
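For instance, a small dataset can be represented and aggregated with plain Python structures in just a few lines (the product figures here are invented):

```python
# Representing a small dataset with plain Python structures.
rows = [
    {"product": "tea", "units": 30},
    {"product": "coffee", "units": 45},
    {"product": "tea", "units": 25},
]

# Aggregate units sold per product.
totals = {}
for r in rows:
    totals[r["product"]] = totals.get(r["product"], 0) + r["units"]
print(totals)
```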

  • Microsoft Power BI

Using Power BI, you can set the relationship between two objects by dragging a line between their columns. Considered a better option for those newer to working with data, Power BI is simpler to learn than many alternatives and has an intuitive interface. However, it is slower when handling large quantities of data.

  • Tableau

Known for handling large volumes of data at speed, Tableau offers a wide selection of features for visualising data without restrictions on row counts, file size, or the total number of data points. Experienced Data Analysts are big users of Tableau, though it is more complex and requires a depth of knowledge and experience to get the most from its features.


  • KNIME

Open-source software for data analytics, KNIME is free to use and offers additional options that are competitively priced compared to alternatives. With this software, users can create visual data pipelines and carry out whichever analysis steps they want, with results viewed using widgets. Support is offered via an online community, making KNIME a solid choice for any business that wants an affordable, reliable data solution.

  • Salesforce Einstein Analytics

A platform from Salesforce, Einstein Analytics offers a suite of data analytics applications that let users dive into predictive analytics, get fast answers to key business queries and better understand their customer base. Artificial Intelligence is used to surface insights and build AI-driven data visualisations that support companies in reaching their goals.

  • Oracle Analytics Cloud

Being cloud-based, Oracle Analytics gives an organisation access to its data through a single platform, from any device. Data analysis happens in the cloud, with insights drawn from ERP data. Oracle can help users discover what drives business profitability, along with opportunities for growth.

How to get into data modelling

At Academy Xi, we offer flexible study options in Data Analytics that will suit your lifestyle and training needs, giving you the perfect foundation for your future in data modelling.

Whether you’re looking to upskill or entirely transform your career path, we have industry designed training to provide you with the practical skills and experience needed.

If you have any questions, our experienced team is here to discuss your training options. Speak to a course advisor and take the first steps in your Data Analytics journey.


Academy Xi Blog

Quantitative vs qualitative data: methods, differences and examples

By Academy Xi


In a nutshell, quantitative data is all about numbers and statistics, whereas with qualitative data we’re talking words and meanings. Read on to discover when to use which data research approach and what kind of methods you might consider.

What is quantitative data?


Data that is expressed as a defined amount, quantity or within a specific range is referred to as quantitative data. Quantitative data can be counted or measured, so it is common for the data to be stated with a unit of measurement. 

Some examples of quantitative data: kilograms when referring to the weight of something, or metres, kilometres or centimetres in reference to distance. Quantitative research methods include experiments and surveys built around closed-ended questions.

What is qualitative data?


Qualities or characteristics of findings are described with qualitative data. This variety of information can be gathered using observation techniques, interviews, focus groups or questionnaires, and is often presented in a narrative format. Some examples of qualitative data: video recordings, case studies or interview transcriptions.

When to use qualitative vs. quantitative research

Ultimately, you will be best placed using a quantitative approach if you need to test or confirm something, such as a theory.

Qualitative research is the way to go if you want to understand characteristics or traits of trends, or to determine the boundaries for larger data sets.

Both data research options enable you to answer different kinds of questions, so the choice you make will largely be driven by what you’re trying to answer or respond to.

There’s also the possibility of taking a mixed approach, again depending on what you’re trying to answer.

How are quantitative and qualitative data collected?


Quantitative data collection:

  • Observation: watching within a natural setting with no variable control
  • Experiments: variables controlled and manipulated to create cause-and-effect relationships
  • Focus groups and surveys: interviewing with a closed-question or multiple-choice approach

Qualitative data collection:

  • Publication reviews: analysis of texts by various authors on the relevant topic areas
  • Ethnography: close observation of behaviours within a predetermined group for an ongoing period of time
  • Focus groups and interviews: one-to-one and group discussions using open-ended questions to compile opinions

How to analyse qualitative and quantitative data

When it comes to the analysis of data, the methods unsurprisingly alter for each data approach.

Quantitative data analysis

As we’re dealing with numbers, statistical analysis is often applied to establish data patterns, with outcomes plotted in graphs or tables.

Generally speaking, you could be looking to discover things such as average scores, reliability of results and how many times a certain answer was provided.

Preparation of the data before it is analysed is incredibly important. The data gathered needs to be validated, any known errors removed, and remaining data coded. This process ensures the best quality data is going to be analysed and provides a more accurate and helpful outcome.

The two most widely used methods for analysing quantitative data are inferential statistics and descriptive statistics.

  • Inferential statistics: show relationships between multiple variables, which means predictions can be made. Correlation measures the strength and direction of the relationship between two variables, while regression models that relationship so one variable can be predicted from another. Analysis of variance (ANOVA) tests whether the averages of different groups differ significantly.
  • Descriptive statistics: provide absolute numbers, but don’t explain the reasoning or context behind them. Useful to apply when there is a limited amount of research available and mostly used for analysing single variables.

Qualitative data analysis

As we’re dealing with words, images or video content, qualitative data can be more challenging to analyse.

Examination of any recurring themes within the data is a helpful approach to take, as is exploring the frequency of use of phrases or words. The idea, like quantitative data analysis, is to discover patterns.
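A simple sketch of that frequency-based approach, using invented survey responses: count how often words recur (skipping filler words) to surface candidate themes:

```python
import re
from collections import Counter

# Hypothetical open-ended survey responses.
responses = [
    "The delivery was fast and the support team was helpful",
    "Support was slow but delivery was fast",
    "Fast delivery, friendly support",
]

# Count recurring words across responses, skipping filler words,
# to surface candidate themes for closer qualitative reading.
stopwords = {"the", "was", "and", "but"}
words = [w for w in re.findall(r"[a-z]+", " ".join(responses).lower())
         if w not in stopwords]
common = Counter(words).most_common(3)
print(common)
```

Counting alone can't capture meaning or context, which is why methodologies like thematic and narrative analysis still require human interpretation.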

Methodologies which could be used include:

  • Grounded theory: establishing new theories from the data
  • Thematic analysis: identifying patterns in meaning to determine themes
  • Content analysis: interpreting the meaning carried by text, images or other content
  • Narrative analysis: discovering how research participants construct stories from their personal experiences

Best data collection tools & techniques

Now that we’ve looked at various approaches to gathering data, let’s look at some specific tools.

  • Qualitative data tools & techniques

While we might find ourselves using focus groups or interviews as a technique to collect data, tools such as ‘sentence completion’ or ‘word association’ can provide a wealth of further data to explore. With sentence completion, an individual is given a part-sentence to complete, and the answers provided give us an insight into the views and ideas of that person. Word association performs a similar function, where the individual is asked to share what comes to mind when they read or hear particular words.

  • Quantitative data tools

When it comes to drilling down into the digits, you might embrace statistical software options such as SPSS, JMP, Stata, SAS, R or MATLAB.

How to get into Data Analytics

Arm yourself with quality industry-aligned training that teaches you the process of collecting, organising, cleaning, and analysing raw data to identify patterns and draw conclusions.

With study options to suit all levels of ability, Academy Xi has you covered.

Do you have any questions? Our experienced team is here to discuss your training options. Speak to a course advisor today and take the first steps in your Data Analytics journey.

Academy Xi Blog

Student Spotlight: Jocelyn Fisher

By Academy Xi


With a passion for problem solving and an eye for detail, Jocelyn is a perfect match for a career in Data Analytics. Find out how the Data Analytics: Transform course has helped Jocelyn go from complete beginner to industry-ready in just 15 weeks.

Keen to kickstart a new career that would play to her professional strengths, Jocelyn decided to enrol in the Data Analytics: Transform course.

Can you tell us about your training and career history before Academy Xi?

I studied Environmental Science, but my last few roles have been in administration. Most recently, I’ve been working in administration with New South Wales Health.

Before I joined Academy Xi, I really didn’t have any experience with Data Analytics. I completed one module on statistics at university, which did help me understand a bit of the jargon, but other than that I was completely fresh to the discipline.

Why did you decide to study Data Analytics?

I was searching for an online course that would enable me to change my career. I wasn’t sure what to study, so I completed a few online quizzes that help you identify a career based on your professional strengths and personality. The quizzes all highlighted my attention to detail, as well as my ability to recognise patterns and solve problems. As you might guess, Data Analytics kept popping up as a career suggestion.

I did some research into Data Analytics and it definitely appealed to me. My partner even gave me his old Data Analytics textbook. I read a chapter on SQL and actually really enjoyed myself! I don’t normally respond well to learning theory in textbooks, so that really solidified the idea that Data Analytics is the course I should go for.

Why did you choose to study with Academy Xi specifically?

Paying for a course is a big commitment, so I was very thorough in choosing who to study with. While I was researching Data Analytics courses, Academy Xi kept popping up in my Google searches.

All the information about the Academy Xi course was really clearly broken down online. I still had a few questions, so I spoke to one of your course advisors. They were incredibly friendly and helpful, which I took as a positive sign for what the Academy Xi experience would be like in general.

Jocelyn Fisher

The Academy Xi course was one of the most moderately priced and covered an enormous amount of content in just 15 weeks. Plus, it offered a long period of career support. In the end, it was definitely the best deal on offer and I decided to enrol.

What would you pick as the highlight of the course?

By far the biggest highlight for me was learning to write Python code. I didn’t have any previous experience with coding, but now I know how to write in a programming language, which is absolutely marvellous. 

In particular, I really enjoyed creating data visualisations with Python and Matplotlib. I’m the kind of person who likes to make things look pretty, so the whole process of converting data into graphs and charts was very satisfying! It’s a handy skill to be able to tidy up data and present it in a stakeholder-friendly way. 

Plus, I was really proud of my capstone project and pleased to come away with a portfolio of really fun and interesting work that I can show to prospective employers. 

What projects did you work on?

For my capstone, I completed a predictive modelling project that used data from the tabletop role-play game Dungeons and Dragons. I built a model that calculates the challenge rating score for any player-created monster and accounts for 91% of the variance in the data. I’ve tested out the model with a real game and it actually works. I’m a complete Dungeons and Dragons nerd and just couldn’t resist coming up with a project based on something I’m passionate about! 

Beyond the coursework projects, I also wrote a blog. I reached a stage at which point I was ahead of schedule with all the coursework, so I asked my mentor if there was any extra work I could complete. He suggested that I write a Data Analytics blog. I decided to give it a shot and wrote an article about colour blindness and data visualisation. The article explains how Data Analysts can use tools within Python that make graphs and data visuals accessible to people who are colourblind. Writing the blog was optional, but I’m glad I decided to go ahead with it. Having a piece of published writing is a great addition to my portfolio. 

How did you find studying online?

For me, studying online wasn’t a problem at all. I know some people prefer face-to-face, but I really enjoyed the flexibility of completing the course remotely. 

We had two sessions a week, each two hours, on Tuesdays and Thursdays. I didn’t find that level of commitment too demanding, but it was never too little either. It allowed enough time in between to carry on working in my role and complete my coursework assignments. 

It also helped that our cohort was really friendly. If anyone had a question or was struggling with something in particular, they could post it in the class channel on Slack. People were always happy to help and give advice. Honestly, I never felt like I was studying alone.  

How did you find working with your mentor?

My mentor JP was fantastic. I really enjoyed JP’s approach to teaching and the energy he brought to the live sessions. His experience really shone through in the classes. He knew the course material inside out and was able to answer our questions and discuss everything in detail.

Jocelyn Fisher

We also got to see some of JP’s professional projects during the classes. He showed us a tracking system for a ship port that he’d built for a previous employer’s website. It was cool to see how we could go from something as basic as learning to code with Python, to building such a comprehensive project. Seeing JP’s professional work was inspiring and gave me a lot of motivation. 

Did you benefit from your 1:1 sessions with JP?

Without a doubt. It was great to be able to book those one-on-one meetings with JP, especially when it came to completing the projects. I’d book a session whenever I got stuck. In the end, I realised it was much better to book a session before I got stuck! We’d go over what problem I wanted to address, figure out what my outputs would be and plan my methodology.

JP even went as far as to extend the sessions beyond the allotted 30 minutes. His enthusiasm to help the students was very clear, and I really appreciated his level of commitment.

How’s your job search going?

I’ve just finished signing up for the career toolkit, which is the first step in the Academy Xi Career Support Program. I’ve now got six months of career assistance to help me land that first position.

I’ll probably be looking for something contractual or temporary, just because I’m totally new to the industry. That will give me the chance to build up my experience and then I’ll be in a strong position to apply for something more permanent.

Do you feel like the course has prepared you for a new role?

Absolutely. There are definitely areas of knowledge within Data Analytics that I know I’ll need to add to with time. That said, I’ve got the fundamentals under my belt and feel more than capable of working in an entry-level position. Entry-level positions exist for a reason. I’ll need to spend a few years working in the industry to really fine-tune my skills and build up my know-how.

There are a few modules from the course that I’d already like to deepen my knowledge in. There are all kinds of resources and free tutorials online, so I’ll work my way through some of those while I’m applying for jobs. Once I’ve spent a bit of time in the industry, I’ll have a clearer sense of where the gaps in my knowledge are and then aim to address those specifically.

Would you recommend Academy Xi?

100%. Even for somebody interested in courses beyond Data Analytics, I’d say go for it. I can tell from my personal experience that Academy Xi really knows how to structure its courses and does a great job of working with an online system. Whatever course you might be interested in, I’m certain it’s going to be really useful to you and your career. 

I started the Data Analytics course as a complete beginner. By graduation, I could write my own code with Python, create amazing data visualisations and had built a pretty impressive portfolio. For anyone interested in starting a career in digital industries, an Academy Xi course is a great way to get the ball rolling.

Jocelyn Fisher

Are you keen to bring the power of Data Analytics to your career just like Jocelyn? Check out our Data Analytics courses.
