RMIT University Library - Learning Lab

Artificial intelligence tools

Discussions of artificial intelligence, or AI, are everywhere these days. Every week seems to bring a new tool that can write text, create images or solve other problems. But what exactly is AI, and just what can it—and can it not—accomplish?

What are artificial intelligence and AI tools?

Artificial intelligence is a branch of computer science. Its goal is to create computer programs or tools that can mimic human intelligence and complete tasks that generally require human skills. AI tools are helping with critical thinking and analysis tasks in many fields, such as medicine and engineering. Recently, many new tools have been released that can produce writing and create art, such as OpenAI’s ChatGPT and DALL-E, Google’s Bard, and Adobe’s Firefly.

Figure 1. Image generated using NightCafe from the prompt "two students walking together on a university campus in the style of a post-impressionist painting".

How does an AI tool generate text?

Tools that generate text are a type of AI called a ‘large language model’ (LLM). They have this name because they are large, very complex computer programs that are fed billions of words of text (a process called ‘training’) until they become good at understanding the patterns of language—not only grammatical patterns, but also which words and phrases are related to each other and how they tend to appear together in a text.

When you ask one of these tools a question, it analyses your question and tries to answer you based on the patterns it knows. These generative-text tools are based on statistical models of language patterns, so as they write your answer, they are continually analysing probabilities to guess which word or phrase is most likely to appear next in a piece of clear and coherent writing.
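As a very rough sketch of this "most likely next word" idea, the toy Python program below counts word pairs in a tiny training text and then guesses the next word from those counts. This is a drastic simplification of a real LLM, which learns far richer patterns than adjacent word pairs; the corpus and function names here are invented purely for illustration.

```python
# A toy "next word" predictor (NOT a real large language model):
# it counts which word follows which in a tiny training text,
# then picks the statistically most likely continuation.
from collections import Counter, defaultdict

# A made-up miniature "training corpus".
corpus = (
    "the student reads the book . "
    "the student writes the essay . "
    "the student reads the notes ."
).split()

# "Training": count word-pair (bigram) frequencies.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def most_likely_next(word):
    """Return the word that most often followed `word` in the corpus."""
    return following[word].most_common(1)[0][0]

print(most_likely_next("the"))      # "student" (it followed "the" 3 times)
print(most_likely_next("student"))  # "reads" (2 times, vs "writes" once)
```

The program has no idea what a student or a book is; it only knows which words tend to follow which. Real LLMs work on the same statistical principle, just at an enormously larger scale.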

It’s important to understand that LLMs are not search engines. While they can make it sound like they have all the answers because they’re so good at writing convincing language, they don’t actually ‘know’ anything. They are simply writing the most reasonable response based on probability. This is why they don’t always provide accurate information, and why some tools are unable to provide sources for the ideas in their writing.

How does an AI tool generate an image?

Like generative-text tools, AI tools that create images from a text description are trained on very large sets of data until they have learned which patterns make an image look realistic—for example, the use of colour, texture and lighting, and the way objects are arranged—as well as which text descriptions match different types of images.

Figure 2. Image generated using NightCafe from the prompt "two students holding hands on a university campus in the style of a colour portrait".

Image-generating tools can create art in different styles because of the large number of images they have already analysed. Just like generative-text tools, they sometimes produce strange results because they do not understand anything about what the real world actually looks like—they are simply mimicking the visual patterns of millions of other images. It should also be noted that in many cases artists were not consulted before their work was used to train these tools, which has led to ethical concerns.

Can I use generative-AI tools to help with my assignments?

It’s important to check with your instructor whether you’re permitted to use AI tools when completing an assignment. Even if AI tools are allowed, relying on them too heavily could mean that you don’t learn the skills your course is meant to teach. And if you use AI tools when it isn’t permitted, that is a breach of academic integrity.

Another important point is that many generative-text AI tools, such as the current version of ChatGPT, were not designed to provide sources. Because they are simply models of how language works, they have no concept of 'information' or where it might have come from. This means that when asked for the source of a piece of information, they will make up books and journal articles that sound real but don't actually exist.