Unit 4: Ethical and Legal
Considerations in GIS
1. Definition
Ethical and Legal Considerations in GIS refer to the framework of moral principles,
professional standards, and legal regulations that govern the collection, storage, analysis,
use, and presentation of spatial data. As GIS technology becomes more powerful and
pervasive, it raises significant questions about individual privacy, data accuracy, potential
for bias, and social equity, which practitioners must navigate responsibly.
2. Explanation
GIS is not a neutral tool. The data and analysis it produces can have profound, real-world
consequences. The old adage, "With great power comes great responsibility," is central to
GIS ethics.
A. Privacy: The Core Ethical Challenge
This is the most significant ethical issue in modern GIS. Location data is deeply personal.
● "Geoprivacy": This is privacy in a spatial context. Your location (where you live, work,
travel) can reveal more about you than almost any other piece of data, including your
political beliefs, religious affiliations, and health status.
● Data Aggregation vs. Anonymity: For decades, the primary defense against privacy
violations was "aggregation"—showing data at a coarse level (e.g., zip code or county)
rather than at the individual level.
● The Problem: Modern, high-precision data (like cell phone location, GPS tracks, or
geotagged social media posts) makes it trivial to "re-identify" individuals, even in
supposedly "anonymized" datasets. Knowing just a few "home" and "work" locations
can uniquely identify a person, as the sketch after this list shows.
● Ethical Question: Just because you can track individuals, should you?
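The following minimal Python sketch illustrates the re-identification risk. All names, device IDs, and coordinates are invented, and rounding to three decimal places (roughly a city block) stands in for a naive "anonymization" step:

```python
# Hypothetical illustration: a home/work coordinate pair as a quasi-identifier.
# All names, device IDs, and coordinates are invented.

# Public or purchasable side data: people and their known home/work locations.
known_people = {
    "alice": ((40.713, -74.006), (40.748, -73.986)),
    "bob":   ((40.650, -73.950), (40.712, -74.013)),
}

# An "anonymized" trace: the device ID replaced, coordinates lightly rounded.
anonymized_trace = {
    "device_7f3a": ((40.713, -74.006), (40.748, -73.986)),
}

def round_pair(pair, places=3):
    """Round a (home, work) coordinate pair to ~city-block resolution."""
    (h_lat, h_lon), (w_lat, w_lon) = pair
    return (round(h_lat, places), round(h_lon, places),
            round(w_lat, places), round(w_lon, places))

# Re-identification is nothing more than a join on the rounded home/work pair.
lookup = {round_pair(locs): name for name, locs in known_people.items()}
for device, locs in anonymized_trace.items():
    match = lookup.get(round_pair(locs))
    if match:
        print(f"{device} re-identified as {match}")
```

Because almost no two people share the same home/work pair, the rounded pair acts as a fingerprint, and the "anonymized" trace collapses to a named individual with a single dictionary lookup.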
B. Data Accuracy, Quality, and Liability
Maps are perceived by the public as being authoritative and "true." This creates a special
burden of responsibility for the GIS professional.
● Accuracy vs. Precision: Data can be very precise (e.g., coordinates reported to 10
decimal places) but wildly inaccurate (wrong). A GIS professional must understand and
document the source, age, and limitations of their data (see the worked example after
this list).
● "Fitness for Use": Is data collected for one purpose (e.g., tax assessment)
appropriate for another (e.g., modeling a flood's path)? Using data unfit for its
purpose is unethical and dangerous.
● Legal Liability: What if a map is wrong and causes harm?
○ An ambulance is routed to the wrong address by a faulty map.
○ A 911 dispatch system fails, and someone dies.
○ A faulty flood-zone map leads to homes being built in a high-risk area.
○ GIS professionals and their organizations can be held legally liable for the
consequences of their errors.
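A small worked example of the accuracy/precision distinction, using invented coordinates: the reported point carries 10 decimal places of precision yet sits hundreds of metres from the true location. A standard haversine calculation quantifies the ground-distance error:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6_371_000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

true_position     = (40.7128, -74.0060)              # surveyed "ground truth"
reported_position = (40.7154321098, -74.0031234567)  # very precise, still wrong

error = haversine_m(*true_position, *reported_position)
print(f"Reported to 10 decimal places, but {error:.0f} m off")  # prints ~380 m
```

The ten decimal places suggest millimetre resolution, yet the point is roughly 380 metres from the truth: precision says nothing about accuracy.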
C. Bias in Data and Algorithms
GIS analysis can both reveal and amplify societal biases.
● Data Bias: If the input data is biased, the output will be biased. This is the "garbage
in, garbage out" principle.
○ Example: "Predictive policing" models are often trained on historical arrest data.
If this data reflects biased policing practices (e.g., over-policing certain
neighborhoods) rather than actual crime rates, the AI model will "learn" this bias
and recommend that police be sent back to those same neighborhoods, creating
a discriminatory feedback loop (simulated in the first sketch after this list).
● Algorithmic Bias: The design of the analysis itself can be biased.
○ Example: How do you define "access to a park"? Is it a simple 1-mile buffer
(which treats everyone in the circle equally)? Or does it account for real-world
barriers like highways or a lack of sidewalks, which disproportionately affect the
elderly or disabled? The simple buffer "hides" this inequity (see the second sketch
after this list).
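The first sketch below is a hypothetical two-district simulation of the feedback loop. Both districts are given the same true crime rate; only the starting patrol allocation is biased, yet a model that reallocates patrols by arrest counts never corrects it:

```python
# Hypothetical two-district simulation (all numbers invented).
# Both districts have the SAME underlying crime rate, but patrols start
# out unevenly allocated. Because observed arrests scale with patrol
# presence rather than with crime, a model trained on arrest counts
# keeps sending officers back to the over-policed district.

TRUE_CRIME_RATE = 0.05                           # identical in both districts
patrols = {"district_a": 80, "district_b": 20}   # biased starting allocation

for year in range(3):
    # Arrests reflect where police look, not where crime actually is.
    arrests = {d: patrols[d] * TRUE_CRIME_RATE for d in patrols}
    total = sum(arrests.values())
    # "Predictive" step: allocate next year's 100 patrols by arrest share.
    patrols = {d: round(100 * arrests[d] / total) for d in arrests}
    print(f"year {year}: arrests={arrests} -> next patrols={patrols}")
```

The 80/20 split reproduces itself every year, even though the two districts are, by construction, identical.

The second sketch contrasts a naive 1-mile buffer with a crude barrier-aware test of park access. Coordinates are invented and laid out on a local metre grid; the straight-segment highway check is a deliberately simple stand-in for true street-network routing:

```python
import math

PARK = (0.0, 0.0)        # park entrance on a local metre grid (invented)
BUFFER_M = 1_609         # "1-mile buffer" definition of access
HIGHWAY_Y = 500.0        # an impassable east-west highway (invented barrier)

homes = {
    "home_a": (300.0, 200.0),   # same side of the highway as the park
    "home_b": (300.0, 800.0),   # inside the buffer, but across the highway
}

def crosses_highway(a, b):
    """True if the straight segment a-b crosses the line y = HIGHWAY_Y."""
    return (a[1] - HIGHWAY_Y) * (b[1] - HIGHWAY_Y) < 0

for name, home in homes.items():
    dist = math.hypot(home[0] - PARK[0], home[1] - PARK[1])
    in_buffer = dist <= BUFFER_M                 # the naive definition
    has_access = in_buffer and not crosses_highway(home, PARK)
    print(f"{name}: buffer says {in_buffer}, barrier-aware says {has_access}")
```

home_b counts as "served" under the buffer definition but not once the barrier is considered; which definition the analyst chooses decides who appears on the map as lacking park access.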
D. Access and the "Digital Divide"
Who has access to GIS data and technology, and who is left out?
● Access to Data: Is spatial data, often funded by taxpayers, made available to the
public for free? Or is it kept behind paywalls, accessible only to corporations and
wealthy institutions?
● Access to Technology: GIS analysis requires powerful software and technical
training. Communities that lack these resources may be unable to advocate for
themselves (e.g., by creating their own maps to counter a developer's proposal).
● Representation: Whose data is on the map, and whose is missing? (e.g., mappers
have historically failed to map informal settlements or Indigenous lands).
3. Examples
● Privacy: A data broker buys "anonymized" cell phone location data, re-identifies
individuals, and sells lists of "people who visited a cancer clinic" or "people who
attended a political protest" to advertisers or political campaigns.
● Liability: A county releases a simple web map of property lines. A homeowner uses
this non-survey-grade map to build a fence, accidentally building it on their
neighbor's property, leading to a lawsuit. The map should have had a clear disclaimer.
● Bias: A city uses GIS to find the "optimal" location for a new landfill. The model's
criteria (e.g., "cheap land," "far from wealthy neighborhoods") systematically place all
candidate locations in or near low-income, minority communities. The analysis is
technically correct but ethically disastrous.
● Access: A wealthy neighborhood uses GIS to model traffic and successfully lobby the
city to block a new bus route. A poor neighborhood, lacking those technical
resources, is unable to effectively lobby for the same route.
4. Diagrams and Visual Concepts
● The "Modifiable Areal Unit Problem" (MAUP):
○ Show two maps of the same data (e.g., voting patterns).
○ Map 1: Data is aggregated by county. It looks like the "Blue" party won.
○ Map 2: The exact same data is aggregated by zip code. It looks like the "Red"
party won.
○ This shows how simply changing the shape of the analysis (the "unit") can
completely change the result, a key ethical issue in data presentation (see the
sketch after this list).
● "Redlining" Map:
○ Show a historical "redlining" map of a city (where banks refused to lend).
○ Show a modern map of the same city showing "areas of high poverty" or "poor
health outcomes."
○ The striking similarity of the two maps visually demonstrates how historical,
data-driven biases can have lasting, multi-generational consequences.
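A minimal Python sketch of the MAUP, using an invented one-dimensional electorate: the same nine voters, grouped under two different unit schemes, hand the majority of units to different parties:

```python
from collections import Counter

# Hypothetical 1-D electorate: voter positions 1..9 and their party.
voters = list(zip(range(1, 10), ["B", "R", "R", "B", "R", "R", "B", "B", "R"]))

# Two different (invented) aggregation schemes over the SAME voters,
# given as inclusive (low, high) position ranges per unit.
schemes = {
    "counties":  [(1, 3), (4, 6), (7, 9)],
    "zip codes": [(1, 1), (2, 6), (7, 9)],
}

def unit_winner(lo, hi):
    """Winner-take-all result for voters whose position falls in [lo, hi]."""
    counts = Counter(party for pos, party in voters if lo <= pos <= hi)
    return counts.most_common(1)[0][0]

for name, units in schemes.items():
    winners = [unit_winner(lo, hi) for lo, hi in units]
    overall = Counter(winners).most_common(1)[0][0]
    print(f"{name}: unit winners {winners} -> '{overall}' carries most units")
```

Here "Red" holds the popular majority (5 of 9 voters) and carries two of three "counties," yet under the "zip code" scheme "Blue" carries two of three units. Nothing about the voters changed; only the aggregation boundaries did.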
5. Important Terms
● Geoprivacy: The right of an individual to control their own location data.
● PII (Personally Identifiable Information): Any data that can be used to identify a
specific individual. Location is increasingly considered PII.
● Liability: Legal responsibility for the consequences of one's actions or products (e.g.,
a faulty map).
● Fitness for Use: A concept that data is only "good" or "bad" relative to the specific
purpose for which it is being used.
● Algorithmic Bias: Systematic and repeatable errors in a computer system that create
unfair outcomes, such as privileging one arbitrary group of users over others.
● Digital Divide: The gap between those who have access to information and computer
technology and those who do not.
● MAUP (Modifiable Areal Unit Problem): A statistical bias in spatial analysis where
the results depend on the shape and scale of the aggregation units (e.g., counties vs.
zip codes).
6. Conclusion
Ethical and legal considerations are not an optional add-on to GIS; they are a core
competency. As GIS practitioners, we are not just technicians; we are storytellers and data
custodians with a significant influence on policy and public perception. A responsible GIS
professional must proactively ask not only "Can we do this?" but also "Should we do this?"
and "Who is being harmed, and who is being helped?"