
Wed 8th Feb 2017
Digital Economy Bill
Lords Chamber

Committee: 4th sitting (Hansard): House of Lords


Debate between Baroness Byford and Lord Bishop of Chester
The Lord Bishop of Chester

My Lords, this is an important amendment because it touches upon the bigger issue of the impact of artificial intelligence on all sorts of aspects of our lives. There is a law called Moore’s law, which observes that roughly every two years the power of computers doubles. That has held true over the past 20 or 30 years, and we should assume that that power will continue to grow. Artificial intelligence in all its forms will be more and more prevalent in our society, and more and more potent in the hands of terrorists, in the years to come.

We cannot ask Ofcom to solve all the problems in this area, but I would like to know where the ownership of these risks and the rapid changes in our society falls in the eyes of the Government. Perhaps Ofcom has a role in this regard—search engines or whatever—but it is really part of a bigger picture of how we get ahead of the game with the impact of artificial intelligence. We read in the papers about driverless cars appearing on our streets, and in many other areas of life artificial intelligence will impact upon us. Where is this owned in the corridors of government?

Baroness Byford (Con)

My Lords, I would like to support my noble friend in his amendment. Algorithms are basically mathematical: the power of computers is used to record, classify, summarise and project actions that indicate what is happening in the world around us. Algorithms can be applied in particular to social media, to which other noble Lords have referred, and to normal internet usage and browsing. They reach decisions about the public interest, about you and about me.

According to a recent radio programme, algorithms are used to make individual decisions in the fields of employment, housing, health, justice, credit and insurance. I had heard that employers are increasingly studying social media to find out more about job applicants. I had not realised that an algorithm, programmed by an engineer, can, for example, take the decision to bin an application. If that is true, it is absolutely unacceptable. It certainly explains why so many jobseekers do not receive a response of any kind. There is a very real danger that a person could also be refused a mortgage or a better interest rate as the result of an algorithmic decision. Even now, some companies use algorithms based on phone numbers to decide whether a caller is high or low value. Highs get to speak to a person; lows are left holding on until they hang up. Algorithm designers, I understand, refuse to answer any questions about the data that are used or their application, on grounds of commercial confidentiality. There are real concerns that, if we continue to allow such liberties, there will be an increasing risk of discrimination—intentional or accidental—against people of certain races, religions or ages. One example of algorithm use cited in the programme was that of differential pricing by Uber.

The EU intends that by May 2018 citizens will have the right to an explanation of decisions affected by the workings of these algorithms, such as the online rejection of a bank loan. I do not feel that we should wait until then, and although my noble friend’s amendment might not be perfect, I am really grateful that he has tabled it today and that we are having this worthwhile debate.