This Does Not Compute: part 3

Published 16 May 2022

Technology has changed virtually every aspect of our lives – from our daily communications and business to shopping and recreation.

In the final part of our Law and IT series, titled This Does Not Compute, UNE’s Professor Mark Perry,* editor of the newly released Legal Issues in Information Technology, considers where we are heading – in daily life and within the legal profession itself.

“Laws change with time and circumstances,” Mark says, “but as demonstrated by the COVID regulations, rapidly changing laws make it hard for civil society to know what it is supposed to be doing on any given day. We need a well-principled core set of legislation that is technology-independent, yet applicable to unknown forthcoming circumstances.”

In this series:
This Does Not Compute: part 1
This Does Not Compute: part 2

So what key legal issues do you think we will grapple with in future?

"Certainly, fact-checking of information shared online. There are a number of fact-checking groups being set up within social media platforms like Facebook, as well as the media (such as the BBC and New York Times), to verify if material is false or not.

"In the next few years, I think we will see blockchain-based environments being used increasingly to ensure the integrity of the data being used. It’s much harder to refute something that is being alleged if it is recorded inside one of the distributed ledgers, so that’s a great deployment of technology.

"The issue is: do people really care? I don’t think people care enough about whether facts are true or not, which is sad. That’s a deeper philosophical question.

"What is more problematic are the Artificial Intelligence engines, which are at the moment built by people, and have their own biases built in. A recent example of AI engines being deployed is by Airbnb to check up on people when they register to rent a property – the engine trawls the internet and goes through all the social media sites to see whether you are likely to be the kind of person who will end up trashing the place.

"At the moment, if the AI thinks you are a high risk, then it will flag the application and that would go to a human to evaluate. In the future it probably won’t go to a human; the engine will make the decision. Using AI for decision-making, whether for renting, health, or vehicle control, opens up a plethora of potential liability issues."

What of cybersecurity?

"If it’s online, the bottom line is that it’s not really secure. The banks and other financial institutions try hard, but even they have their issues.

"The government tries to ensure their internal communications are secure, but they’re not really secure either. A decade ago ASIO’s new building blueprints were hacked and the stolen plans turned up in China. In 2016 the Australian census was shut down by a simple hack. Data leaks and insecure systems are a common occurrence in all sectors."

Within the legal profession, some of these IT issues are playing out, too. You’ve got online dispute resolution, and virtual lawyering and automated lawyering. Do you expect the legal profession will introduce such measures willingly?

"With the development of caselaw databases in the 1970s, legal research was an early adopter of using computer systems to find facts. With the recent wide acceptance of the virtual appearance and the use of digital everything, from signatures to AI computing resources, most of the profession can be seen as being onboard with such changes. I think most of them are quite happy to adopt whatever is effective, to try to spend less time (money) on aspects of their work that can be automated, for example form production.

"However, even with dispute resolution meetings happening online in Zoom rooms, you may wonder how secure the meetings are, who may be listening, and even eavesdropping remotely. You wouldn’t know.

"Trust is a big issue in law. Lawyers have to have the trust of their clients and, if the client is suspicious, it’s harder to get all the information that’s needed, take the necessary actions and have a good outcome. That’s an ongoing problem."

What do changes within the legal profession mean for clients? Will we have to rethink what the legal relationship looks like?

"In the past, trust has been developed in a legal relationship because of the legal representative’s reputation and through the client meeting them. They personally connect and it works. I think that will change because of the nature of the online environment. There is a difference between being in a room together and being a mouse click away. I think the trust factor will grow stronger and reputation will be key."

And what of the future of our engagement with IT in broader society?

"It is incumbent on all of us to come up with solutions that work, given the new online environment we now live in. Knee-jerk legal reactions to solve specific problems don’t make sense, and there have been lots of examples of that over the years. Either the new legislation doesn’t work or the industry and people have moved on in the time it takes to come into place. If there are legislative changes all the time, nobody knows what they should be doing."

Some would say that technology creates a situation where it is perhaps less likely that people will behave better, because they can get away with things.

"I became a lawyer with an impossible dream a long time ago. The dream was that our society would become more sophisticated and care more for others. As a computer scientist, I thought that technology would make life easier. That didn’t happen either; we are now slaves to our devices. Neither the technology nor the way it’s been deployed has helped.

"Most social media sites rely on mass attraction so they don’t care so much what is attracting people. For example, fake news legislation enacted recently in the ASEAN countries can be used to suppress real news, which they can label 'fake news' because it disagrees with the current government. That’s what happens when you get technology-specific legislation being misused."

If we can’t rely on people behaving better or the technology making lives easier, where does that leave us?

"It leaves us in a pickle. I worry for the kids in particular. Technology is so addictive, whether it’s games or chatting with people. You end up being in groups with people who have similar thoughts to you, so it reinforces the group’s thoughts, and that could be a quite large group of people online.

"So that leaves us with drafting or relying on legislation that deals with the broader issues. For instance, we already have a law about false advertising, the Australian Consumer Act, that says businesses can’t deceive customers. This, of course, applies online as well as in a shop."

So what conclusions have you reached?

"That we need greater discussion and debate; we need more people smarter than me to think about social solutions, and how they can be deployed on the internet, to encourage people to be more respectful and responsible.

"Humans are fallible and sometimes behave badly; that’s why we have laws and law enforcement. We have to continue to be adaptable and optimistic. There are always opportunities for things to be improved, and I just hope that people will continue to work to make that happen. Hopefully our book will generate some discussion and ideas to that end."

* Professor Mark Perry is a lawyer and computer scientist with the UNE School of Law.
