Incident 137: Israeli Tax Authority Employed Opaque Algorithm to Impose Fines, Reportedly Refusing to Provide an Explanation for Amount Calculation to a Farmer

Description: An Israeli farmer was issued a computer-generated fine by the tax authority, which allegedly was unable to explain how the amount was calculated and refused to disclose the program and its source code.
Alleged: Israeli Tax Authority developed and deployed an AI system, which harmed Moshe Har Shemesh and Israeli people subject to tax fines.

Suggested citation format

Hall, Patrick. (2021-01-11) Incident Number 137. in McGregor, S. (ed.) Artificial Intelligence Incident Database. Responsible AI Collaborative.


Incident Reports

A tax dispute between a ranch in Israel’s southern Negev desert and the country’s tax authority has given rise to an issue which could have far reaching repercussions on the public’s ability to oversee and even understand government acts.

The issue, currently being debated in the Justice Ministry, is whether software or its source code can be considered “information” that the authorities are obliged to disclose to the public.

The story began in 2014 when the farm Har Shemesh, which is not part of any community, asked the Tax Authority to explain how it had calculated a fine they were required to pay.

The authority’s officials couldn’t explain how they had arrived at the fine’s final amount, claiming the calculation had been carried out automatically by their computer software. The farm owners asked for more details, eventually requesting the program or its source code so that they could examine the formula by which their fine was calculated. However, the Tax Authority refused.

The District Court rejected the farm’s appeal to receive the program's code and accepted the state’s position. The farm then appealed to the Supreme Court and The Movement for Freedom of Information joined the case. The state retracted its position and the Justice Ministry is now debating the issue.

Software discretion

The farm, represented by its owner Moshe Har Shemesh, argued that the Tax Authority’s computer was programmed to exercise discretion in place of the authority’s officials. He said the software has the authority to do what tax clerks are supposed to do: impose fines on late income statements, deduct tax refunds, levy financial sanctions and even deny the right to deduction at source.

The authority’s decision to grant a software program these powers, he argued, was made as an administrative measure, with no public discussion and in the absence of sufficient public awareness of the implications.

Not only was the discretion given to the computer, but the guidelines for carrying it out were not released to the public or even to the authority’s employees themselves, he said.

“When a computerized system is tasked with implementing the procedure, then the operation guidelines are programmed into the software. They are unpublished and unknown, apart from those ‘programming gurus’ who programmed them into the computer language,” he said.

Har Shemesh mentioned a class action lawsuit submitted by Aiad Mahajna, who discovered that the same tax authority computer would recalculate fines over delayed VAT payments only when the consumer price index went up, but not when the index went down. The guidelines given to the program in that case had not been put in writing, and neither the prosecutor nor the public had any way of understanding them. They could only be inferred by laborious data gathering. The judge in that case ruled in favor of the plaintiff and ordered Israel to pay compensation of over 2 million shekels.
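The asymmetric indexation behavior described in the Mahajna case can be illustrated with a short sketch. This is a hypothetical reconstruction inferred from the article, not the Tax Authority's actual code; the function name and parameters are invented for illustration.

```python
def adjust_fine(base_fine: float, cpi_old: float, cpi_new: float) -> float:
    """Hypothetical reconstruction of the asymmetric rule alleged in the
    Mahajna case: a fine over a delayed VAT payment is linked to the
    consumer price index (CPI) only when the index rises, never when it
    falls."""
    if cpi_new > cpi_old:
        # Index rose: the fine grows proportionally with the CPI.
        return base_fine * (cpi_new / cpi_old)
    # Index fell (or stayed flat): the fine is left unchanged,
    # to the taxpayer's disadvantage.
    return base_fine


# The asymmetry in a nutshell: an index rise inflates the fine,
# but an equivalent index drop does not reduce it.
print(adjust_fine(1000.0, 100.0, 102.0))  # larger than the base fine
print(adjust_fine(1000.0, 100.0, 98.0))   # still the base fine
```

Because this rule was never put in writing, such behavior could only be discovered empirically, by comparing many fines against CPI movements over time.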

The Tax Authority argued in court that it had no way to extract the guidelines from the software, because it wasn’t a matter of merely pressing a button, and that it would demand an unreasonable allocation of resources.

“To obtain the guidelines would require the authority to use reverse engineering techniques and to trace the programming processes,” the authority wrote. “It’s not at all certain that these techniques will yield the desired results. Also, it would oblige the authority to set up a team of programmers to identify all the computer’s software dealing with this request.”

The authority estimated that the task would take thousands of programmer work-hours.

As for giving the software itself to the appellant so that they could extract the guidelines from it themselves, the authority claimed that this would endanger state security.

Such a move, the authority claims, could disrupt its function, because it opens the software to hostile agents who could attack the authority’s critical computer system and render it vulnerable to cyber threats and data leaks, thus disrupting the tax system and other things.

Legal programs

Jerusalem District Judge Ram Vinograd said software cannot constitute information, as it is rather an external product of it, or a “work tool.” The judge argued that the request was tantamount to claiming that if a government office ordered an interior designer to rework its office to meet legal guidelines, the office itself would become “information” by virtue of being based on those requirements.

A work tool, whether it’s an office or a cutting machine or a program, he said, doesn’t turn into “information” because it was created to serve legal ends or meet legal requirements.

He also ruled that the authority, which created the software and had its guidelines programmed into it, is the software’s intellectual property owner. This property has financial value, even if the authority doesn’t plan to sell it.

In contrast, Har Shemesh argued that administrative guidelines, whether in a paper folder or in computer software, cannot be the authority’s private intellectual property, just as the Knesset isn’t the copyright owner of the legislation and amendments it issues.

As for financial value, just as it’s unthinkable for Israel to purchase from Iceland the guidelines to implement the latter’s tax laws, so it’s unthinkable that some state would buy Israel’s administrative guidelines.

He also said that claiming the software has financial value contradicts the claim that the software was secret and releasing it could harm state security.

“If giving the information to someone could harm state security, then obviously it has no commercial value, because the authority would never sell it and compromise state security,” he said.

The Movement for Freedom of Information, an apolitical NGO set up to advance the right to information in Israel, asked to join the appeal as a friend of the court and to show why the term “information” encompasses anything that can be passed on – including software, code or algorithms.

The NGO said the freedom of information law was intended to enable the public to supervise government activities and that in a technological world, where decisions are also made by computers, software must be open to scrutiny as well. The public cannot supervise the government if the law is “hidden” in a program’s algorithm. It also entirely exempts the gatekeepers from examination.

The NGO’s activists said that for years government officials have been using various computer software, such as commercial off-the-shelf products, or software made for the state by commercial companies. In many cases the software arrives only as an executable file (an .exe file) in machine code, rather than as source code that programmers can read. In such cases, any oversight depends entirely on reverse engineering, in which prolonged experiments are conducted on the software to learn how it works. In the absence of a human being making the decision, and without open-source software, no one – not a public employee, not a gatekeeper, not the public – can truly supervise the way decisions are made.

Human beings, like the Tax Authority employees in this case, have a tendency to rely on the machine’s decision, but the result the machine reaches isn’t necessarily the right one, and the machine’s decisions must be supervised. This isn’t a theoretical claim – software assessing inmates’ risk in the United States, for example, attributed considerable significance to the detainee’s skin color and amplified structural biases in the justice system.

Open-source state software

The NGO says the considerations behind designing programs or artificial intelligence that now replace, albeit partially, human judgment must be open to the public. The privacy laws in Europe’s GDPR require that, in the case of automated decisions, information be provided about the reasoning involved in the decision, its significance and its expected implications. A French law from 2016 requires that such decisions be explained, including the programming processes that led to them.

In order to do this, the state must have full access to the software it operates, and should even make it open-source, with the code and software open to all. This allows it to be examined by others.

The use of open-source software will enable finding bugs in a program's code and allow oversight of the way the software was constructed or implemented. It will also enable the developers community to improve the code and trace vulnerabilities in it, the NGO says.

The NGO’s position was written by the movement’s CEO lawyer Rachely Edri and lawyer Or Sadan, with the assistance of the students Shir Toledano, Shahar Mandil and Igor Bistrov from Sadan’s freedom of information clinic in The College of Management Academic Studies. Students from the cyber human rights clinic at Haifa University, guided by Dr. Dalit Ken-Dror, also assisted.

The Supreme Court returned the case to the District Court to complete the process.

When an Israeli Farmer Declared War on an Algorithm