Law & Open Gov Data

The Home Page for GW Law 6351 Reading Group: Open Government Data

Source Code on GitHub

Syllabus – 6351: Reading Group (Open Government Data).

The George Washington University Law School (Spring 2018)


Federal, state, and local governments have increasingly adopted open-data policies (e.g., OMB Memorandum M-10-06), created new open-data portals (e.g., Data.gov), and established new open-data offices (Chief Data Officers, Chief Innovation Officers, etc.), all with an eye toward promoting open government data. This reading group will explore several key legal and technological issues related to open government data, discuss the implications of open data for personal privacy and civil society, and examine the effect that open data may have on the practice of law.

Statement of Learning Outcomes. A student who completes this course should be familiar with major laws and policies surrounding open data in the United States. Additionally, the student should understand the major political and practical challenges that shape open-data policy today and in the years to come.

Contact Information. V. David Zvenyach, (202) 596-1770

Pre-Requisites. There are no pre-requisites for this course.

Written Requirement. In keeping with the norm in the open-data “space,” all participants must write reaction essays in the form of a “blog post” based on the reading assignments or class discussion.

With the exception of the first post, each post should be based on the reading materials for the upcoming class. So, for example, the post for session #4 will be based on the reading materials on “Privacy and ‘Big’ Open Data” and will be due by midnight two days before class (see “Deadlines and Late Work”). The first post will be due by midnight two days before the second session.

Each post should be roughly 500-1,000 words. Although iterative drafting and commenting are highly encouraged, each post will be evaluated based on the last version submitted before the deadline.

Deadlines and Late Work. All essays must be turned in by midnight two days before class. A late essay will have 4 points deducted for each day it is late. For example, if class is held on Wednesday, the reaction essay for that class will be due at midnight on Monday.

Grading Rubric. The course will be graded out of 100 points. Each of the 6 essays will receive a maximum of 10 points, based on the technical and substantive quality of the writing. The remaining 40 points will be based on participation in class and through peer review of others’ essays. In order to pass the course, you will need 65 points. This course is graded on a CR/NC basis.

Class Attendance. As reflected in the GW Law School Bulletin, “[r]egular class attendance is required and is necessary for successful work. A student who is deficient in class attendance or participation will, after the instructor or [D]ean of [S]tudents attempts to communicate with the student, have a grade of No Credit (NC) entered on the record absent an excuse. (Here, as elsewhere in the Bulletin, email correspondence to a student’s official Law School email address is one fully acceptable means for student notification.) No excuse for deficient attendance or participation will be granted except by the dean of students and then only upon proof of unexpected serious illness, injury, or other emergency. A student whose excuse is accepted by the dean of students will be withdrawn administratively from the course.” (GW Law School Bulletin, p. 19 & 39).

Disability Support Services. Any student who may need an accommodation based on the potential impact of a disability should contact the Office of Disability Support Services (DSS) at 202-994-8250, in Rome Hall, Suite 102, to establish eligibility, and the Dean of Students Office at 202-994-8320 to coordinate reasonable accommodations. For additional information, please refer to the DSS website. Remember, Law School examination protocol calls for anonymous grading, and disclosure of a disability to a professor has the potential to breach exam anonymity. Students may contact the Dean of Students Office regarding registering with the Office of DSS or eligibility to receive accommodations (e.g., notetaking assistance, adaptive technologies, etc.).

Academic Integrity Policy. Students must strictly adhere to the GW Law School’s Academic Integrity Code (see GW Law School Bulletin), the publication Citing Responsibly, and the University’s Code of Student Conduct.

Recording of Classes. This course will follow the Law School’s “Class Recording Policy,” available at the Dean of Students Office Website. Essentially, students may request class recordings when they will be absent for religious reasons, family emergencies, and other authorized absences. Requests for recording and questions about the policy should be directed to the Dean of Students Office.

Session 1: Introduction to Open Government Data (Jan 17, 2018).

What is open government data? What makes open government data different from preexisting transparency-related efforts, such as the Freedom of Information Act? How is open government data used? What role do governments play—and what role should they play—in collecting, disseminating, and using open government data?

Guest Lecturer

Daniel Schuman

Reading Materials

Session 2: Understanding the tools of Open Government Data (Jan 31, 2018).

This session will address many of the important tools related to the use of open government data, including APIs and datasets, web scrapers, regular expressions, natural language processing, and version control (git/dat).
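To make the contrast between these tools concrete, here is a minimal Python sketch. The HTML snippet and field names are invented for illustration; it shows a regex-based scraper extracting spending figures from a web page, then emitting the structured JSON an open-data API would publish directly.

```python
import json
import re

# Hypothetical HTML of the kind a scraper might fetch from an agency
# web page (the table contents here are invented for illustration).
html = """
<table>
  <tr><td>FY2017</td><td>$1,250,000</td></tr>
  <tr><td>FY2018</td><td>$1,430,500</td></tr>
</table>
"""

# A regular expression pulls each fiscal year and dollar amount out of
# the presentation-oriented markup.
row_pattern = re.compile(r"<td>(FY\d{4})</td><td>\$([\d,]+)</td>")
records = [
    {"fiscal_year": year, "amount": int(amount.replace(",", ""))}
    for year, amount in row_pattern.findall(html)
]

# Serializing to JSON yields the kind of machine-readable dataset an
# open-data API would serve directly, with no scraping required.
print(json.dumps(records, indent=2))
```

The sketch illustrates why open-data advocates prefer APIs and published datasets: the scraper breaks the moment the page layout changes, while the JSON form is stable and machine-readable by design.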

Guest Lecturer

Bill Hunt

Reading Materials

Session 3: The DATA Act (Feb 14, 2018).

In 2014, Congress enacted the DATA Act, which mandated the establishment of government-wide data standards for financial data and required publication of “consistent, reliable, and searchable government-wide spending data that is displayed accurately for taxpayers and policy makers.” Three years later, the DATA Act has been implemented across the federal government. This session will explore how the DATA Act came to exist and what implementation of major open-data legislation looks like.
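A short Python sketch may help make “government-wide data standards” concrete. The field names below are simplified stand-ins, not the actual DATA Act schema; the point is that a shared standard lets any consumer check whether an agency’s spending record conforms.

```python
# A minimal sketch of what a shared data standard means in practice:
# every agency reports spending records in one agreed-upon shape.
# These field names are illustrative, not the real DATA Act elements.
REQUIRED_FIELDS = {
    "awarding_agency": str,
    "award_amount": float,
    "fiscal_year": int,
}

def validate(record):
    """Return a list of problems; an empty list means the record conforms."""
    problems = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(f"wrong type for {field}")
    return problems

good = {"awarding_agency": "Treasury", "award_amount": 5000.0, "fiscal_year": 2017}
bad = {"awarding_agency": "Treasury", "award_amount": "5000"}

print(validate(good))  # a conforming record reports no problems
print(validate(bad))   # a nonconforming one reports each defect
```

Much of the implementation difficulty discussed in the readings lives in exactly this step: agreeing on the fields, getting dozens of agencies to emit them consistently, and deciding what to do with records that fail validation.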

Guest Lecturer

Renata Maziarz

Reading Materials

  • DATA Act of 2014, Report of the Senate Committee on Homeland Security and Governmental Affairs
    • The report describes several of Congress’s “lessons learned” from past experiences. What were some of these lessons learned?
  • Matt Rumsey, A brief history of the DATA Act, Sunlight Foundation (May 8, 2017)
    • The author describes the origins of the DATA Act as beginning with the Federal Funding Accountability and Transparency Act of 2006 (FFATA). What were some of the shortcomings of FFATA? How did the DATA Act propose to overcome those limitations?
    • Note that FFATA was sponsored by then-Senator Barack Obama, but the article claims that President Obama’s Office of Management and Budget attempted to weaken the DATA Act. What could explain this?
    • What were some of the techniques that the government used to help implement the DATA Act? Is the process of implementation relevant to the efficacy of Open Government Data? How?
  • Andrew Prokop, Beating the odds: Why one bill made it through a gridlocked Congress — and so many don’t, Vox (May 22, 2014)
    • The author describes the origins of the DATA Act as beginning with the stimulus bill – the American Recovery and Reinvestment Act of 2009 – and the implementation of standards by Earl Devaney. Remember that Rumsey suggested that the origins were grounded in FFATA. What do you make of these competing origin stories? What do origin stories suggest about the goals of legislation? What do they suggest about the authors’ interpretation of those goals?
    • The article describes some of the interest group politics involved. What were their incentives? What was the impact of their advocacy? What about the executive branch’s incentives? What was the impact of their advocacy?
    • The author suggests that the result of the legislative process was a weakened law. What does this suggest about how to approach open-government-data policies?
  • Becky Sweger, How DATA Act implementation is opening up federal spending, 18F (June 9, 2015)
    • The article describes some of the core challenges associated with DATA Act implementation. To what extent did these challenges inform the drafting of the legislation?
    • What does the author propose as some major use cases for the implementation? What questions can the data answer?
  • The DATA Act – Working Towards Federal Spending Transparency, GAO WatchBlog (Nov 9, 2017)
    • What did the GAO find in its report about data accuracy and completeness?
    • What were some examples of challenges concerning consistency of presentation of data? To what extent do they parallel earlier “lessons learned?” What does that suggest?
    • To what extent were the limitations in the accuracy and completeness of the data affected by the legislative process itself?
  • DATA Act 2022: Changing Technology, Changing Culture (May 2017)
    • Take a look at the document’s vision for Federal Spending in 2022. What are the suggested primary benefits associated with the DATA Act? Who are the beneficiaries? To what extent is that vision consistent with the original stated intent of the DATA Act?
    • The DATA Act took 3 years to implement from passage in 2014 to first public reporting in 2017. Total implementation is supposed to be complete in 2021. What are your predictions about what happens in 2022?
    • What are the challenges with continued implementation of the DATA Act? What are the solutions?
    • In the next session, we will discuss what level of responsibility the government has to provide “context” to data it publishes and to explicitly engage consumers of data. What are the authors’ views on the government’s responsibility to engage the public around the data?

Session 4: Open-Data Policies and Ownership of Government Data (Feb 28, 2018).

This session will discuss the key concepts behind successful open-data policies and will cover issues surrounding ownership of government data, including claims to copyright, use of terms of service, and data licenses.

Guest Lecturer

Josh Tauberer

Reading Materials

Session 5: Privacy, Law Enforcement, and “Big” Open Data (Mar 14, 2018).

One of the most commonly cited concerns about open government data is personal privacy and the “mosaic” effect of big data. What are the social implications of the collection and dissemination of government data? How should governments address privacy when thinking about open government data? How does law-enforcement’s increasing reliance on surveillance technologies figure into open data?

Guest Lecturer


Reading Materials

  • Paul Ohm, Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization, 57 UCLA L. Rev. 1701 (2010)
    • This article argues that “[d]ata can be either useful or perfectly anonymous but never both.” Why is that the case? Why has anonymity been an important value for data management?
    • How does reidentification work? Why is reidentification a challenge to implementation of traditional privacy law frameworks? What are some examples of how reidentification has defeated privacy goals?
    • What options do policymakers have in terms of addressing the core challenges created by reidentification?
    • What does the article suggest should be the “test” for data handling?
  • Latanya Sweeney, Simple Demographics Often Identify People Uniquely, Carnegie Mellon University (2000)
    • Did the findings in the paper surprise you?
    • This article is almost 20 years old and was published seven years before the release of the iPhone. Do you suspect that the analysis would lead to the same results today?
    • What are some of the techniques described in the article? How complicated do they seem?
  • Open Data Privacy Playbook (Feb. 27, 2017)
    • Similar to Ohm’s article, the document outlines the tension between utility and risk. What are the authors’ conclusions about the effectiveness of “PII” as a privacy tool?
    • In lieu of PII, what does the document recommend with regard to data publication?
    • Reflect on the framework offered by Yu & Robinson’s “New Ambiguity of Open Government.” To what extent does that framework interact with the document’s recommendations? What is a responsible data steward to do?
  • Elizabeth Joh, Five Lessons From the Rise of Bodycams, Slate (Nov. 28, 2016)
    • What are the lessons learned from the use of body cameras by law-enforcement officials?
    • The author argues that “without regulations or guidelines, body cameras are becoming all-purpose surveillance tools.” What sort of regulations or policies are discussed by the author? What else would you add to the list, given what you know about open government data thus far?
  • Ellen Nakashima, Secrecy around police surveillance equipment proves a case’s undoing, Wash. Post (Feb. 22, 2015)
    • What’s a Stingray? How widely have they been used?
    • What has been the reaction of the courts?
    • What is Accurint? Here’s a published price list in Florida. Take a look at what is available for purchase. Anything surprising?
  • EFF, Street-Level Surveillance, Automated License Plate Readers
    • What’s an ALPR? How can they be used?
    • What’s Vigilant Technologies? How are ALPR databases used?
    • What are the threats posed by ALPR? How have these technologies been abused?
  • Emily Shaw, The local projects that are making police complaint data open and accessible, Sunlight Foundation (Oct. 25, 2016)
    • The article points to several examples of increased access to police complaint data. What is different between the different efforts across cities?
    • What is the role of civil society in reviewing and monitoring police complaint data?
    • What are some of the findings from the existing efforts to release police-complaint data?
  • Emily Shaw, Using data to track police response to sexual assault, Sunlight Foundation (Aug. 31, 2016)
    • As the author notes, “most police information about sexual assault cases is protected.” How does she get the data described in the article? How do these techniques inform how other open-government data inquiries can be carried out?
    • As you read through this article, consider how public data about law-enforcement techniques provides accountability for law-enforcement. To what extent does data about the government’s actions help promote better government? Again, what is the role of civil society in this?
    • Over the semester, we’ve discussed the incentives for public officials related to open government data. What are the incentives for law-enforcement agencies?
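The reidentification technique at the heart of the Ohm and Sweeney readings can be sketched in a few lines of Python. All of the records and names below are invented for illustration: a dataset released “anonymously” is linked to a public record (such as a voter roll) by joining on the quasi-identifiers the two share.

```python
# "Anonymized" dataset released without names (records are invented).
medical = [
    {"zip": "20052", "dob": "1975-03-02", "sex": "F", "diagnosis": "asthma"},
    {"zip": "20052", "dob": "1980-11-19", "sex": "M", "diagnosis": "diabetes"},
]

# Publicly available records (e.g., a voter roll) that include names
# alongside the same quasi-identifiers (also invented).
voters = [
    {"name": "Jane Roe", "zip": "20052", "dob": "1975-03-02", "sex": "F"},
    {"name": "John Doe", "zip": "20052", "dob": "1980-11-19", "sex": "M"},
]

# Sweeney's quasi-identifiers: ZIP code, date of birth, and sex.
QUASI_IDENTIFIERS = ("zip", "dob", "sex")

def reidentify(anonymized, public):
    """Link records by joining on quasi-identifiers present in both datasets."""
    matches = []
    for record in anonymized:
        key = tuple(record[f] for f in QUASI_IDENTIFIERS)
        for person in public:
            if tuple(person[f] for f in QUASI_IDENTIFIERS) == key:
                matches.append((person["name"], record["diagnosis"]))
    return matches

print(reidentify(medical, voters))
```

The join itself is trivial; the privacy risk comes from how often the combination of ZIP code, birthdate, and sex is unique in the population, which is Sweeney’s central finding and the reason removing “PII” alone does not anonymize a dataset.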

Session 6: Efficacy of Open Government Data (Mar 28, 2018).

This session will explore whether open government data policies are living up to their promise and ask what should be expected from an open government data policy. What are the successes and failures for open government data? How are cities and states using open government data? What’s likely to come next for open government data?

Guest Lecturer

Rebecca Williams

Reading Materials

  • Vanessa Williamson and Norman Eisen, The impact of open government: Assessing the evidence, Center for Effective Public Management (Dec. 2016)
    • The authors describe a very specific definition of what constitutes an effective open government initiative. What are the criteria? Consider Yu & Robinson’s framework. To what extent have the authors assumed this framework in their analysis? What tradeoffs does this imply? To what extent are the frameworks different?
    • How persuasive do you find the concept of the principal-agent model? How does that model correspond to the “Invisible Hand” concept?
    • Consider the 14 Principles of Open Government Data. To what extent is the framework advanced by authors similar or different from the 14 Principles?
  • Tom Lee, Open Data: Better Politics, Winning Politics… But Still Politics, Crooked Timber (July 6, 2012)
    • The post argues that open data is “pre-political”, or in the alternative, that it’s “good politics.” Do you agree? Why?
    • The article describes several objections to open data policies. In this class, we’ve discussed other barriers to open government data. How do you see those barriers playing out over time?
    • What does the post argue about what the right levels of expectation ought to be for open data policies?
    • The post contends that the open-data movement’s long-term outcome will depend on politics. True?
  • Lauren Kirchner, New York City Moves to Create Accountability for Algorithms, ProPublica (Dec. 18, 2017)
    • As the article notes, algorithmic transparency legislation is “cutting edge.” What does the bill in question actually do?
    • What are the origin stories involved in the legislation? What are the politics? To what extent are the politics in play similar or different from other open-government-data policies?
    • One of the findings cited was that the algorithm that prompted the legislation had “a margin of error of 30 percent for one key input of the program.” To what extent is the desire for algorithmic transparency connected to the accuracy of the algorithm? What are the implications of that relationship? What lessons should we draw from Paul Ohm’s article as it relates to this relationship?
  • Robert Brauneis and Ellen Goodman, Algorithmic Transparency for the Smart City, Yale J. L. & Tech. (2017)
    • What is the authors’ core critique around the need for algorithmic transparency?
    • The authors identify 3 impediments to algorithmic transparency. What are they? To what extent do they parallel other open-government challenges? What does that imply about how these impediments might be addressed?
    • What politics are embedded in use of algorithms? To what extent are these politics new or different?
    • What were the authors’ experiences in using open-records laws to obtain information about algorithmic transparency? Why does this matter? What are the authors’ solutions to these experiences?
    • What do the authors suggest should be documented with regard to algorithms?