The impact of artificial intelligence on developers

Recently, tools like GitHub Copilot and ChatGPT have generated quite the controversy. In many cases, AI has made tasks faster, simpler and more efficient, freeing up time for developers to focus on more creative or strategic work. But it can't all be good, right?

Try to imagine a life without artificial intelligence (AI). It’s hard, isn’t it? We use AI every day, whether it’s to find the best route to work, to get a recommendation for a new playlist or to get a quick answer to a question. AI is everywhere, and it’s only going to get more prevalent in the future. But AI has not been without controversy. In fact, it’s been a hot topic recently.

Photo by Mike MacKenzie on Flickr

Assisted coding in interviews

How would you react if you found out that the job candidate you were interviewing was using AI to answer your questions?

That's exactly what happened recently at a job interview. Half of the interview team was upset to find out that the candidate had used GitHub Copilot to answer their questions. The other half was impressed by the candidate's ability to use the resources available to them. Which side of the fence do you sit on?

Personally, I think it’s a bit of a non-issue. If you’re interviewing a candidate, you should be assessing their ability to solve problems, not their ability to type code. It would be like asking a candidate to solve a maths problem on a whiteboard, and then getting upset when they use a calculator to help them. I think coding interviews are a waste of time anyway, but that's a topic for another day.

And if a candidate manages to use ChatGPT to answer my opinion questions in real time, I'll be so impressed by their ability to type prompts rapidly while hiding the fact that they're reading from a script that I might just have to hire them.

Assisted coding in the workplace

Let me preface this by saying that if you are still inexperienced in your career, you should not be using AI to write code for you in order to land a job above your level. Your colleagues will find out very quickly whether you truly know what you are doing or not. And if you don't, you'll be out of a job very quickly with a tarnished reputation.

With that out of the way, let's look at a piece of code.

This function tries to safely find an asset from a list of assets based on its key.

export default function parseAsset(assetsToParse, assetKey) {
  const asset = assetsToParse.find((x) => x.key === assetKey)

  if (!asset) {
    throw new Error(`No asset found with key ${assetKey}`)
  }

  if (!asset.value) {
    throw new Error('Provided asset has no value')
  }

  return asset.value
}
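
For context, here's a quick usage sketch. The asset keys and values below are purely illustrative, not from a real project:

const assets = [
  { key: 'logo', value: 'https://example.com/logo.png' },
  { key: 'banner', value: 'https://example.com/banner.png' },
]

// Returns 'https://example.com/logo.png'
const logoUrl = parseAsset(assets, 'logo')

// Throws: No asset found with key icon
parseAsset(assets, 'icon')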

To test this using Jest, write the following to get 100% coverage:

import parseAsset from './parseAsset' // adjust the path to wherever parseAsset lives

describe('parseAsset', () => {
  test('throws if no asset found for given key in asset list', () => {
    const assetsToParse = []
    const assetKey = 'some-key'

    expect(() => parseAsset(assetsToParse, assetKey)).toThrow(
      `No asset found with key ${assetKey}`,
    )
  })

  test('throws if no asset value found for given key in asset list', () => {
    const assetsToParse = [{ key: 'some-key', value: undefined }]
    const assetKey = 'some-key'

    expect(() => parseAsset(assetsToParse, assetKey)).toThrow(
      'Provided asset has no value',
    )
  })

  test('returns parsed asset', () => {
    const assetsToParse = [
      {
        key: 'some-key',
        value: 'Test',
      },
    ]

    const assetKey = 'some-key'
    const parsedAsset = parseAsset(assetsToParse, assetKey)

    expect(parsedAsset).toEqual('Test')
  })
})
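
If you want Jest to hold you to that 100% figure, a minimal configuration sketch could look like this. It assumes an ESM setup (e.g. "type": "module" in package.json), that the function lives in parseAsset.js, and that you run Jest with the --coverage flag; adjust the paths to your project:

// jest.config.js
export default {
  collectCoverage: true,
  collectCoverageFrom: ['parseAsset.js'],
  coverageThreshold: {
    global: { branches: 100, functions: 100, lines: 100, statements: 100 },
  },
}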

Are you at all concerned about whether the function and its tests were written with assistance from GitHub Copilot or ChatGPT?

I'm not. I'm more concerned about whether the code works, is well tested and is easy to understand. If you are an engineering manager, would you want your team spending time on tasks that could be automated, rather than on solving problems that are unique to your business? If you are a software engineer, would you want to spend time hand-writing code that you could have produced with AI in half the time?

I think the answer to both of those questions is a resounding no.

Assisted coding in the future

Everyone is concerned about AI taking over their jobs. Do I think that will happen? No, I don’t. [Insert joke about AI having to understand client requirements first].

AI is trained to do a specific task, and it's very good at that task. But it's not good at everything. It's not good at understanding the context of a problem, or the nuances of a situation. The specific problem you are trying to solve at your company is incredibly unlikely to appear, already solved, in public training data. You will still have to do a large amount of work to make AI work for you. That's why it's called Copilot, not Pilot.

AI is a tool, and it’s a tool that can be used to make your life easier. But it’s not a replacement for you or your team. AI is not going to take your job, but it can make your job easier.

Ethics and other downsides

AI is not without its downsides. Like any tool, it can be used to do good things or bad things.

You will always find malicious actors who will misuse technology for their own gain. That's not a reason to stop using technology, rather it's a reason to be more vigilant. It's also important to remember that AI is trained on data. If the data is biased, the AI will be biased. If the training data includes copyright-infringing material, the AI may generate copyright-infringing material. As developers, we don't have control over the data that AI is trained on, but we do have control over the choice of AI tools we use.

It is up to you to make the right choice. Investigate the tools you are using, and make sure they are not being used to do harm.

Final thoughts

As you can tell, I'm hugely in favour of AI in development. Each month I see the debit order for GitHub Copilot come out of my account and my first thought is "What a damn pleasure".

Would it surprise you to know that I wrote this article with GitHub Copilot and ChatGPT helping me draft it? If you've read this far, it shouldn't. I'm a developer, not a copywriter. The fact that it took hours of hand-holding the AI to produce something I was happy with, and that I still did most of the writing by hand, is proof that your job is safe.

I'm not saying that we should ignore the downsides or ethical implications of AI. But it has greatly enhanced my daily work, and I'm sure it will do the same for you.

Resources