In this edition of InfoRM:
Vexatious or frivolous information requests – what does this really mean?
Refusing to disclose information where the information request is vexatious or frivolous is common to the Official Information Act, the Local Government Official Information and Meetings Act, and the Privacy Act. The test can be difficult to navigate, but two recent developments have sought to provide more clarity.
The first is the recent decision of the High Court in Attorney-General v Dotcom. The Attorney-General appealed a Human Rights Review Tribunal (HRRT) decision that concluded (amongst other points) that Mr Dotcom's urgent Privacy Act request to every Cabinet Minister, and almost all Government departments, for all personal information held about him was not vexatious or frivolous. The Court reaffirmed that the test for determining whether a request is vexatious is objective, and noted there did not need to be proof of impropriety or serious misconduct on the part of the requester. The Court focused on the context for the request, being Mr Dotcom's eligibility proceedings for extradition to the US, and on the fact that the urgency of the request was due to those proceedings. The Court stressed that an "everything" request is perfectly valid, but in this case the breadth of the request, combined with the requested urgency, was sufficient to meet the vexatious threshold. The Court considered the request would not have been vexatious had Mr Dotcom withdrawn the urgency element.
The Court took into account a wide range of factors, which aligns with the theme in the guide released by the Office of the Ombudsman to assist agencies in deciding whether they can decline an Official Information Act request. The guide reminds agencies there is a high threshold for declaring a request frivolous or vexatious, as the starting principle is always to make the information available unless there is good reason for withholding.
Considerations that an organisation could have regard to include whether the request:
- imposes an excessive and unreasonable burden on the agency;
- lacks any serious purpose or value (being merely inconvenient or annoying is not enough);
- is intended to cause disruption, irritation or distress to an agency or its staff (mere scrutiny of an agency or its staff is not enough);
- causes unreasonable harassment of, or distress to, staff;
- has a particular history and context suggesting the requester's approach has gone beyond what is reasonable, and has become excessive and disproportionate; and
- could be declined on other grounds (in which case those grounds should generally be used).
Agencies should remember that it is the request, not the requester, that must be vexatious, and each request should be considered on a case-by-case basis. The guide recommends that this ground be considered by those at a senior level within the organisation, and that their reasoning, and the evidence supporting it, be documented.
Australian Government introduces a Bill to create a national facial recognition database
The implications of using facial recognition software are the subject of ongoing debate both here and in Australia, where the Australian Government has introduced a bill that would allow the creation of a nation-wide facial recognition system.
The Identity-matching Services Bill 2018 (Bill) provides for information sharing between the Commonwealth, state and territory governments by creating a nation-wide database of people's physical characteristics and identities, and integrating them with a facial recognition system. The system would initially enable centralised access to passport, visa, citizenship and driver licence images, with government agencies then allowed to submit images to verify someone's identity (Face Verification Service), or alternatively, use it to identify an unknown person (Face Identification Service).
Various stakeholders have criticised the Bill for its serious implications for human rights, with the key themes being:
- there are few limits on data collection and use, allowing disclosure of personal information for a broad range of purposes. For example, the Face Identification Service may reveal not only who an unidentified person is, but who they are in contact with, when and where. This has the potential to facilitate broad tracking and profiling, which could itself engage other human rights, such as freedom of expression and freedom of association.
- the perceived lack of adequate and effective safeguards contained in the Bill, particularly around the circumstances in which agencies can access the system, and the amount of information they have access to.
- the potential for private sector access to this information, with stakeholders questioning whether this is appropriate and whether the safeguards provide adequate protection against misuse of information.
Accessibility of personal information – Naidu v Royal Australasian College of Surgeons
The HRRT has recently commented on what additional information agencies must provide to ensure that personal information given to individuals is easily accessible.
Dr Naidu requested personal information from the Royal Australasian College of Surgeons (RACS) relating to his scores from a training programme. RACS provided a table listing the scores from the referees but not the scoring formula or mechanism explaining the scores.
The HRRT concluded the scoring formula or mechanism should have been provided to Dr Naidu, despite it not being personal information, as it was a condition precedent to being able to access the personal information in the table. Without the formula, access was not given in terms of privacy principle 6 of the Privacy Act 1993, which requires "meaningful access", that is, information provided in a way that can be comprehended.
Interestingly, the HRRT supported its interpretation of "meaningful access" with reference to the General Data Protection Regulation (GDPR). Although New Zealand is not a member of the European Union, New Zealand is designated under the GDPR as a country providing "essentially equivalent" data protection. The HRRT stated that, given New Zealand's designation, it is only appropriate that wherever reasonably possible, consistency should be achieved between New Zealand privacy laws and the GDPR. In regard to access to personal information, the GDPR requires the standard of communication to be "in a concise, transparent, intelligible and easily accessible form, using clear and plain language".
Around the world of privacy:
Interpretation of personal information
In another interesting case on what constitutes personal information, the European Court of Justice (ECJ) has commented on the status of a candidate's answers on an examination script. Mr Nowak failed an examination set by the Institute of Chartered Accountants of Ireland (CAI), and so requested access to all personal data held by the CAI. The CAI refused to send his marked examination script on the grounds it did not contain personal data.
The ECJ concluded that the written answers submitted by a candidate constitute personal data, as the content of the answers reflects the extent of the candidate's knowledge and competence in the field. Any comments made by the examiner are also personal data, because they constitute information relating to that candidate.
This case illustrates how individuals can use privacy legislation to access personal information when appealing unfavourable determinations.
The right to be forgotten (for one plaintiff)
The English High Court has heard its first "right to be forgotten" case, where Warby J had to consider whether the claimants had the right to have personal information "delisted" or "deindexed" by the operators of internet search engines.
The claimants were businessmen who were convicted of criminal offences. One claimant, referred to as NT1, was involved in a controversial property business, was convicted in the late 1990s of criminal conspiracy to account falsely, and was sentenced to four years' imprisonment. The other claimant, referred to as NT2, pleaded guilty to two counts of conspiracy to carry out surveillance in the early 2000s. He was sentenced to six months' imprisonment. NT2 was successful in getting his personal information removed, while NT1 was not.
Warby J focussed on a number of factors related to the offending history and events since conviction in coming to the different conclusions. For NT1, Justice Warby concluded the information had been legitimately available for many years and there had only been a modest interference with his right to respect for family and private life. The information remained relevant because NT1 continued working in the same industry, it informed the public of his past dishonesty, and NT1 had failed to acknowledge his guilt or his misstatements to the public and the Court. For NT2, Justice Warby was satisfied the evidence showed the article about NT2 was misleading as to the nature and extent of his criminality, and falsely suggested that NT2 had made criminal proceeds and dealt dishonestly. He also found NT2 had not contested his charge, had expressed genuine remorse, and that his current business activities were so different from his previous ones that his past offending had little, if any, relevance to anybody's assessment of his suitability.
This case illustrates how fact-specific the Court's decisions will be in determining whether a claimant can exercise their "right to be forgotten", and highlights the difficulties search engines and data protection authorities face when processing de-listing requests.
This publication is intended only to provide a summary of the subject covered. It does not purport to be comprehensive or to provide legal advice. No person should act in reliance on any statement contained in this publication without first obtaining specific professional advice. If you require any advice or further information on the subject matter of this newsletter, please contact the partner/solicitor in the firm who normally advises you, or alternatively contact one of the partners listed below.