| Document register | Andmekaitse Inspektsioon |
| Reference | 1.2.-2/26/1671-1 |
| Registered | 21.04.2026 |
| Synchronized | 22.04.2026 |
| Type | Incoming letter |
| Function | 1.2 Records management |
| Series | 1.2.-2 General correspondence |
| File | 1.2.-2/2026 |
| Access restriction | Public |
| Addressee | Apple |
| Method of receipt/dispatch | Apple |
| Responsible person | Pille Lehis (Andmekaitse Inspektsioon) |
APPLE MAPS - STRICTLY CONFIDENTIAL MAP DATA COLLECTION METHODOLOGY
1 of 12
Maps Data Collection Methodology:
Pedestrian
April 2026 – June 2026
Document prepared for regulatory contact purposes only. Strictly Confidential.
EXECUTIVE SUMMARY
Overview
Since 2015, Apple has collected data in various parts of the world to improve the quality of Apple Maps. To date, this program has largely involved collecting geospatial data using vehicles that are outfitted with sensor systems. These vehicles are clearly marked with Apple Maps branding and have been deployed across North America, Europe, Asia, and Australia.
To complement the vehicle program, Apple has developed portable systems geared toward improving Apple Maps in places that are inaccessible to the Apple vehicles (for example, blocked streets, pedestrian corridors, parks, pavilions, and public trails).
Apple’s pedestrian collection program is separate from the vehicle collection program, although the types of sensors used are essentially the same. Consistent with our automobile collection efforts, Apple takes steps to protect the privacy of individuals. The project has always taken full account of previous guidance from regulators, including the former EU Article 29 Data Protection Working Party's guidance on the collection of mapping imagery for publication, as well as discussions with the relevant Data Protection Authorities with whom we have been in contact prior to Apple commencing data collection for the described purposes. The collections in Europe were and will be carried out by and under the control of Apple Distribution International Ltd in Cork and will adhere to the GDPR and European data protection requirements as we understand them.
Apple uses a backpack system with a sensor array worn by a surveyor for the “pedestrian collections”
described in this document. In 2019, Apple successfully tested the backpack systems at several prominent
locations in the United States, Asia, and Europe. These collections were done in consultation with the Data
Protection Authorities in each country. Apple hopes to expand this pedestrian collection effort to more areas.
When combined with the imagery collected from Apple Maps vehicles, the result of the pedestrian collections is
a visual representation and navigable map available within the Apple Maps application and Apple Maps
framework. Apple will consult with the Data Protection Authorities before the imagery is published. The
publication of imagery would be similar to that available in Apple’s Look Around feature
https://support.apple.com/en-ie/guide/iphone/iph65703a702/ios.
This methodology document gives an overview of Apple’s pedestrian collection program and contains a proposed schedule for upcoming work in your country. In particular, this overview comprises the following sections:
• Map Build Components
• Backpack System
• Data Collection Systems
• Security
• Appendix 1: Photographs of the Backpack System
• Appendix 2: Terminology
• Appendix 3: Planned Collection Dates and Locations: Backpack System
Map Build Components
In improving the quality of Apple Maps, Apple’s pedestrian systems use sensors to collect the following types of
data:
• GPS traces (heading, latitude, longitude and altitude)
• 2D still imagery
• LiDAR (backpack system only)
A detailed summary of the purpose for each data type follows:
GPS Traces
The GPS trace information is a bundle of data containing system heading, latitude and longitude (position), and
altitude. This set of precise measurements is the core data set that will allow the map production team to build
an accurate representation of the path walked by the surveyor. The elements of heading, latitude, and
longitude have been collected since the very first map data generation systems were deployed nearly 30 years
ago.
This information is essential for this collection, as our goal is to improve the accuracy of areas that could not be traveled by Apple vehicles, including, but not limited to, narrow streets, blocked streets, pedestrian corridors,
parks, pavilions, and public trails. Many of these components are also leveraged in other data collection efforts,
and as such there may be similarities.
Another sensor to aid the navigation data is the Inertial Measurement Unit (IMU). The data from this source is
combined with the GPS traces to give an accurate picture of how the system has moved over time.
There are many manufacturers of devices which collect this type of information. For example, Maxon and Vectornav distribute systems that are used for such purposes across the industry.
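As a rough illustration of how GPS fixes and IMU-derived motion can be combined, the sketch below dead-reckons a position forward between fixes. The names, coordinates, and flat-earth approximation are illustrative assumptions, not Apple's actual fusion method:

```python
import math
from dataclasses import dataclass

@dataclass
class GpsFix:
    t: float        # seconds since start of collection
    lat: float      # degrees
    lon: float      # degrees
    alt: float      # metres
    heading: float  # degrees clockwise from north

def dead_reckon(fix: GpsFix, speed_mps: float, dt: float) -> GpsFix:
    """Advance a position estimate between GPS fixes using an
    IMU-derived speed and the last known heading (flat-earth approximation)."""
    dist = speed_mps * dt
    rad = math.radians(fix.heading)
    dlat = (dist * math.cos(rad)) / 111_320          # ~metres per degree latitude
    dlon = (dist * math.sin(rad)) / (111_320 * math.cos(math.radians(fix.lat)))
    return GpsFix(fix.t + dt, fix.lat + dlat, fix.lon + dlon, fix.alt, fix.heading)

fix = GpsFix(t=0.0, lat=59.437, lon=24.754, alt=12.0, heading=90.0)  # made-up fix
step = dead_reckon(fix, speed_mps=1.4, dt=1.0)  # walking pace, one second later
```

Heading 90 degrees means due east, so the step moves the longitude while leaving latitude essentially unchanged.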
2D Still Imagery
2D still imagery is a highly effective tool used in map data production, which allows for data editors to append
key ‘attributes’ to the GPS trace information. Examples of attributes include, but are not limited to:
• Street and pathway geometries
• Street signs
• Transition zones (e.g., between pedestrian walkways and streets)
• Barriers
• Landcover (e.g., grass, dirt, concrete)
These are all details which enable the map database to become a truly accurate representation of pedestrian
areas as they correlate to the street network. This enables consumers to enjoy a reliable navigation experience while changing between modalities during a trip (for example, starting in a vehicle, then walking, then taking
public transit). Without such attributes correlated to the shape and position of these pedestrian areas, it becomes
exceedingly difficult to provide map display, search, routing and navigation experiences of benefit to the
consumer. In some respects, attributes are a form of metadata. Additionally, raw imagery will be used only to improve the algorithms that blur faces and license plates in images published in the Look Around feature. Only
imagery that has gone through the privacy blurring process described below will be used to develop and improve
other Apple products and services, which may include using data to train models, such as models related to
image recognition, creation, and enhancement. When imagery is used to improve products and services outside
of Apple Maps, Apple ensures images used from each specific location were taken at least six months apart in
order to maintain privacy.
LiDAR
LiDAR (Light Detection and Ranging) is a technology analogous to RADAR: rather than radio signals, it uses pulses of light to detect shape and form. This technology is commonly used by digital map makers across the industry. For the purposes of Apple’s map data collection, LiDAR is used to establish the height, width and depth of buildings and other structures for multi-dimensional representation.
LiDAR’s primary functions are:
• Augmenting the position of ‘attributes’, relative to that of other objects.
• Bringing a rough outline of shape to the object being detected, in order to provide confirmation of the
object type should it be unclear.
• Providing specific dimensions for crucial attributes such as wall heights, structural divisions and
transitional areas.
The output of collected LiDAR data is often referred to as a ‘point cloud’ and can only be viewed through high-end, production-level software applications. LiDAR data is commonly used as an ingredient for building
map databases to enable products such as routing and navigation, and for providing three dimensional depth
to objects such as buildings.
The raw LiDAR data is primarily used to help assess the position of objects relative to each other or to a fixed position: for example, the distance between two walls or the full length of a hallway. LiDAR is also used to
help reconstruct shapes of objects to ensure that the imagery is properly shaped and oriented.
Most data is organized by the number of points generated in a particular pulse or reading. Some sample output:
• Point count: 3,249,019
• ZMin: 302.88
• ZMax: 713.54
• Scan Angle: -5, 21
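The summary fields above can be derived directly from the raw points; a minimal sketch using made-up (x, y, z) tuples:

```python
# Toy point cloud: each point is an (x, y, z) tuple; the values are invented
# to mirror the z-range shown in the sample output above.
points = [(1.0, 2.0, 302.88), (1.5, 2.1, 500.00), (1.7, 2.2, 713.54)]

summary = {
    "point_count": len(points),
    "z_min": min(p[2] for p in points),   # lowest elevation in the cloud
    "z_max": max(p[2] for p in points),   # highest elevation in the cloud
}
print(summary)  # {'point_count': 3, 'z_min': 302.88, 'z_max': 713.54}
```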
Data Collection Systems
Backpack System
The backpack system is a frame constructed as a backpack which is worn by the surveyor. The system
captures data about its surroundings as the surveyor walks. This type of portable system has been used by
other mapping companies for some time.
The backpack camera array consists of 5-8 still cameras capturing still imagery (1 frame per second), 1-3 LiDAR units (positioned on the side of the frame or under the cameras, collecting dimensional information such as the distance between two walls or the full length of a hallway), and 1 positioning system (GNSS, IMU).
To capture accurate positioning, the system uses 2 GNSS antennas. To capture the imagery during the
collection, the system contains a computer which controls the flow of information throughout the backpack.
The entire system is powered by 2 batteries. In addition, an iPhone may be used to run diagnostics on the
collection system and to perform calibration.
Security
Backpack System
The data collected by the backpack system is stored on a solid state drive (SSD). The state of the SSD is
fully tracked. When full, the SSD is shipped to Apple’s upload facility, copied onto servers, then securely
deleted, and shipped back to the field for re-use. Apple has followed this process for the vehicle data collection
program since 2015 and it has proved to be secure and successful.
The tracking system monitors the SSDs in multiple states, including:
• Idle/at upload facility
• In transit for use
• Deployed and not in use
• Deployed and in use
• In transit for data ingestion
• Data removed for ingestion
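The lifecycle above can be modeled as a small state machine; the state names and allowed transitions below are a hypothetical rendering of the bullet list, not Apple's actual tracking system:

```python
from enum import Enum, auto

class SsdState(Enum):
    IDLE_AT_FACILITY = auto()
    IN_TRANSIT_FOR_USE = auto()
    DEPLOYED_NOT_IN_USE = auto()
    DEPLOYED_IN_USE = auto()
    IN_TRANSIT_FOR_INGESTION = auto()
    DATA_REMOVED = auto()

# Allowed transitions, following the ship -> collect -> ingest -> wipe -> reuse loop.
TRANSITIONS = {
    SsdState.IDLE_AT_FACILITY: {SsdState.IN_TRANSIT_FOR_USE},
    SsdState.IN_TRANSIT_FOR_USE: {SsdState.DEPLOYED_NOT_IN_USE},
    SsdState.DEPLOYED_NOT_IN_USE: {SsdState.DEPLOYED_IN_USE},
    SsdState.DEPLOYED_IN_USE: {SsdState.DEPLOYED_NOT_IN_USE,
                               SsdState.IN_TRANSIT_FOR_INGESTION},
    SsdState.IN_TRANSIT_FOR_INGESTION: {SsdState.DATA_REMOVED},
    SsdState.DATA_REMOVED: {SsdState.IDLE_AT_FACILITY},
}

def advance(current: SsdState, target: SsdState) -> SsdState:
    """Move an SSD to a new state, rejecting transitions the lifecycle forbids."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"illegal transition {current.name} -> {target.name}")
    return target
```

Encoding the lifecycle this way makes any out-of-order movement (for example, a wiped drive going straight back into the field) fail loudly rather than silently.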
All collected data on the SSDs is encrypted using 128-bit AES with PKCS7 padding. The private encryption key is only
held by Apple. It can be changed and a new key deployed within 24 hours.
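A minimal sketch of the PKCS7 padding step named above. The AES encryption itself would come from a cryptographic library; this illustrates only how plaintext is padded to the 16-byte AES block size:

```python
# PKCS7 pads data to a multiple of the cipher block size (16 bytes for AES)
# by appending N bytes, each with value N; unpadding validates and strips them.
BLOCK = 16

def pkcs7_pad(data: bytes, block: int = BLOCK) -> bytes:
    n = block - (len(data) % block)   # always 1..block, so padding is never empty
    return data + bytes([n]) * n

def pkcs7_unpad(data: bytes) -> bytes:
    n = data[-1]
    if not 1 <= n <= len(data) or data[-n:] != bytes([n]) * n:
        raise ValueError("invalid PKCS7 padding")
    return data[:-n]

padded = pkcs7_pad(b"lidar frame 0001")  # 16 bytes -> one full pad block appended
```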
All software used in the data collection process runs on Apple hardware and is also developed by Apple. It
provides monitoring, insight into the settings for the cameras and alerts the operator if there are errors in the
system.
Once on Apple premises, the data follows standard Apple security protocols:
• Data is only stored on physically and logically secured servers
• Access to the data is tightly controlled through Apple access control lists
Image Blurring
As outlined above, before the publication of any form of imagery, steps will be taken to blur faces and license plates so that they are not identifiable in the published product. As our aim is to capture only objects that are stationary, our first objective is to remove anything that is in motion; in particular, to remove the likeness of any person through face blurring. The second objective is to recognize the objects we wish to capture for use in our product, for example a store front or a transitional area such as an elevator or stairs. To this end, we train the software to recognize what a store front looks like, and such areas are identified and processed accordingly. Apple thus strives to remove identifiable information from pedestrian collection data by blurring faces and license plates. Apple uses quality assurance processes, both automated and manual, to ensure the products are developed to Apple’s privacy standards. Once the imagery
is blurred, Apple may publish the imagery in Look Around and use it to develop and improve Apple Maps and
other Apple products and services, including training models related to image recognition, creation, and
enhancement.
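As a toy illustration of region blurring (not Apple's actual pipeline), the sketch below replaces a detected bounding box in a grayscale image with its mean value; the image and detection coordinates are made up:

```python
# Grayscale image as a 2D list of pixel intensities; a "detection" is a
# (top, left, bottom, right) box that a face/plate detector would supply.
def blur_region(img, top, left, bottom, right):
    """Flatten the given box to its mean intensity (a crude anonymizing blur)."""
    vals = [img[r][c] for r in range(top, bottom) for c in range(left, right)]
    mean = sum(vals) // len(vals)
    for r in range(top, bottom):
        for c in range(left, right):
            img[r][c] = mean
    return img

img = [[10, 20, 30],
       [40, 50, 60],
       [70, 80, 90]]
blur_region(img, 0, 0, 2, 2)  # blur the top-left 2x2 "detection"
print(img[0][0])  # 30  (mean of 10, 20, 40, 50)
```

Production systems use learned detectors and stronger blurs, but the structure is the same: locate a region, then destroy the identifying detail inside it while leaving the rest of the frame intact.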
Data Retention
For the purpose of producing a visual representation of streets and buildings, it is intended that all collected
imagery will be retained in its raw collected state for a period of 12 months after the date of collection. This is to
allow for the imagery to be consulted in its raw state while we continue to work on refining this product and our
means for ensuring fully effective blurring of imagery including during collection. Where requested by a data
protection authority, these periods are varied on a per country basis to ensure alignment with local
requirements.
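The 12-month rule amounts to simple date arithmetic; a sketch, with the day-of-month clamp being an illustrative simplification and per-country variations left out:

```python
from datetime import date

def retention_deadline(collected: date, months: int = 12) -> date:
    """Date on which raw imagery collected on `collected` falls due for deletion."""
    year = collected.year + (collected.month - 1 + months) // 12
    month = (collected.month - 1 + months) % 12 + 1
    # Clamp the day to 28 so the result is always a valid calendar date.
    return date(year, month, min(collected.day, 28))

print(retention_deadline(date(2026, 4, 27)))  # 2027-04-27
```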
The chosen period reflects the reality that elements of these products will continuously remain under active
development and refinement with a precise date of release of some elements to our customers yet to be
finalized. We release products to our customers when they meet the exacting quality standards that we set for
ourselves and which our customers expect. The proposed retention periods for the imagery reflect the reality
that it is our raw material for developing our product to the highest standards.
Query Handling
Apple has a dedicated team that handles queries or requests from the public. Apple responds to such queries
within a short period in accordance with GDPR and other applicable privacy laws. All individuals can contact us
via the contact form which is available from our Maps Data Collection website
http://maps.apple.com/imagecollection/. Requests received through our standard privacy contact form are also
handled.
In the event of publication of images, users who wish to report concerns can use our Report an Issue feature in Apple Maps. This is our preferred method, as it allows us to fully track requests and to respond on the precise image of concern, which is submitted as part of the flow. Non-Apple users can use the Report
an Issue feature using Maps on their web browser by going to maps.apple.com. Where a person does not wish
to pursue either of these options, Apple’s contact email address will remain available.
Requests from individuals for access to or deletion of raw imagery are processed by our team upon provision of appropriate identification of the location where an image was likely collected, together with its timing within a 15-minute window; this information is used solely to allow us to try to find the imagery in question.
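Such a lookup amounts to filtering imagery records by location and a time window; a hypothetical sketch in which the record fields and window-matching rule are assumptions:

```python
from datetime import datetime, timedelta

# Made-up imagery records: a location label and the capture timestamp.
records = [
    {"location": "Tallinn Old Town", "taken": datetime(2026, 4, 27, 10, 5)},
    {"location": "Tallinn Old Town", "taken": datetime(2026, 4, 27, 11, 40)},
    {"location": "Tartu", "taken": datetime(2026, 4, 27, 10, 10)},
]

def find_imagery(location: str, around: datetime, window_min: int = 15):
    """Return records at `location` captured within `window_min` minutes of `around`."""
    half = timedelta(minutes=window_min)
    return [r for r in records
            if r["location"] == location and abs(r["taken"] - around) <= half]

hits = find_imagery("Tallinn Old Town", datetime(2026, 4, 27, 10, 0))
print(len(hits))  # 1
```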
Public Notice
For the backpack system collections, we update our Maps Data Collection website
http://maps.apple.com/imagecollection/ to inform individuals of the areas in which we will be collecting images.
This website contains a detailed map view of the streets where collection will take place. An example of this
map view is found in Appendix 3.
Backpack systems are clearly marked with Apple Maps branding so that individuals with questions can raise
them. Individuals conducting these collections will wear Apple branding with a link to our Maps Data Collection
website.
Video
Apple does not collect video as part of the pedestrian data collections.
Image Publication
The primary goal of this map data collection is to improve the quality of the Apple Maps database. Apple may
publish the images collected from the backpack system in some form at a future date if the collection is
successful. Any publication will follow the privacy and security protocols described in this document, which
includes the blurring of faces and license plates.
Plans for Future Collections
Apple is planning future data collections using this methodology in accordance with the schedule found in
Appendix 3.
Appendix 1: Photographs of the current Backpack System
Appendix 2: Terminology
GPS (Global Positioning System) — a radio sensor which uses a number of satellites to determine the ground
position of an object. The components of this sensor are affixed to the back and top left of the collection
system.
IMU (Inertial Measurement Unit) — sensor package attached to the back frame of the system containing
accelerometers, gyroscopes, and a GPS receiver.
LiDAR (Light Detection and Ranging) — a technology analogous to RADAR: rather than radio signals, it uses pulses of light to detect shape and form. For the purposes of Apple’s map data collection, LiDAR is used to establish the height, width and depth of buildings and other structures for multi-dimensional representation.
Appendix 3:
Planned Collection Dates and Locations: Backpack System — Estonia
* the anticipated start date for each city may be adjusted due to unforeseen circumstances, such as weather
conditions.
Sample of Public Disclosure Map for Backpack System Collections:
The areas scheduled for collection with the backpack system will be disclosed to the public in a dynamic map
interface on https://maps.apple.com/imagecollection. The following are screenshots of the view presented to
users.
| State or City | Approx. Number of Backpack Systems | Anticipated Start Date* | Anticipated End Date |
| Tallinn | 4 | 2026-04-27 | 2026-06-05 |
| Tartu | 4 | 2026-04-27 | 2026-06-03 |
Tallinn (City View)
Tallinn (Neighborhood View)
Tartu (City View)
Tartu (Neighborhood View)
Attention!
This message arrived from an external network.
Privileged & Confidential
Dear Sir/Madam,
We are reaching out again in relation to our efforts to collect data in Estonia for Maps. Specifically, we are planning to do pedestrian collections in Tallinn and Tartu.
In this context, we are happy to share that we have improved the way we share Maps privacy information across various channels.
Firstly, we added a specific image collections FAQ on the main Maps landing page at https://www.apple.com/maps/. The FAQs are placed right after the Privacy section on this page. The new FAQ asks how Apple collects data and includes another prominent link to the Maps Image Collection page. The response to this question states: "Apple conducts surveys with vehicles and backpacks to build and maintain Apple Maps, support the Look Around feature, and improve other Apple products and services. Your privacy is protected by censoring faces and license plates in published images. You can find more about our collection practices at maps.apple.com/imagecollection.”
Secondly, we have updated the Image Collection country-specific microsites to include information about the purpose of image collections. This is displayed just underneath the page title and is in a lighter colored, bolder font to distinguish it from the rest of the notice. The microsites also contain information in relation to the use of blurred images to train models and to improve products and services. Please see the updated page for Estonia at https://maps.apple.com/imagecollection/locations/ee.
Finally, we added a direct link to the Image Collection site to the browser-based version of Maps. This is in the same font, color, and position as the other footer disclosure links: Privacy, Terms of Use, and Legal.
I’m attaching our latest methodology document for pedestrian collections, which also includes the specific dates when we intend to collect in Tallinn and Tartu. As you can see from this document, we intend to start collections on 27 April 2026. This information will also be published on the Estonia-specific Maps website.
Please let me know if you have any questions.
Kind regards,
Michael Schidler
Senior Privacy Legal Counsel
Apple | Global Privacy Law and Policy
APPLE MAPS - STRICTLY CONFIDENTIAL MAP DATA COLLECTION METHODOLOGY
1 of 12
Maps Data Collection Methodology:
Pedestrian
April 2026 – June 2026
Document prepared for regulatory contact purposes only. Strictly Confidential.
APPLE MAPS - STRICTLY CONFIDENTIAL MAP DATA COLLECTION METHODOLOGY
2 of 12
EXECUTIVE SUMMARY
Overview
Since 2015, Apple Maps has collected data in various parts of the world aimed at improving the product quality
of Apple Maps. To date, this program has largely involved collecting geospatial data using vehicles that are
outfitted with sensor systems. These vehicles are clearly marked with Apple Maps branding and have been
sent across the North America, Europe, Asia, and Australia.
To compliment the vehicle program, Apple has developed complementary portable systems geared toward
improving Apple Maps in places that are inaccessible to the Apple vehicles (for example, blocked streets,
pedestrian corridors, parks, pavilions, and public trails).
Apple’s pedestrian collection program is separate from the vehicle collection program, although the types of
sensors used are basically the same. Consistent with our automobile collection efforts, Apple takes steps to
protect the privacy of individuals. The project has always taken full account of previous guidance from
regulators including the former EU Article 29 Data Protection Working Party on the collection of mapping
imagery for publication, and also in light of discussions with relevant Data Protection Authorities with whom we
have been in contact with prior to Apple commencing data collection for the described purposes. The
collections in Europe were and will be carried out by and under the control of Apple Distribution International
Ltd in Cork and will adhere to the GDPR and European data protection requirements as we understand them.
Apple uses a backpack system with a sensor array worn by a surveyor for the “pedestrian collections”
described in this document. In 2019, Apple successfully tested the backpack systems at several prominent
locations in the United States, Asia, and Europe. These collections were done in consultation with the Data
Protection Authorities in each country. Apple hopes to expand this pedestrian collection effort to more areas.
When combined with the imagery collected from Apple Maps vehicles, the result of the pedestrian collections is
a visual representation and navigable map available within the Apple Maps application and Apple Maps
framework. Apple will consult with the Data Protection Authorities before the imagery is published. The
publication of imagery would be similar to that available in Apple’s Look Around feature
https://support.apple.com/en-ie/guide/iphone/iph65703a702/ios.
This methodology document gives an overview of Apple’s pedestrian collection program and contains a
proposed schedule for upcoming work in your country. In particular, this overview is comprised of the following
sections:
• Map Build Components
• Backpack System
• Data Collection Systems
• Security
APPLE MAPS - STRICTLY CONFIDENTIAL MAP DATA COLLECTION METHODOLOGY
3 of 12
• Appendix 1: Photographs of the Backpack System
• Appendix 2: Terminology
• Appendix 3: Planned Collections Dates and Locations: Backpack System
Map Build Components
In improving the quality of Apple Maps, Apple’s pedestrian systems use sensors to collect the following types of
data:
• GPS traces (heading, latitude, longitude and altitude)
• 2D still imagery
• LiDAR (backpack system only)
A detailed summary of the purpose for each data type follows:
GPS Traces
The GPS trace information is a bundle of data containing system heading, latitude and longitude (position), and
altitude. This set of precise measurements is the core data set that will allow the map production team to build
an accurate representation of the path walked by the surveyor. The elements of heading, latitude, and
longitude have been collected since the very first map data generation systems were deployed nearly 30 years
ago.
This information is essential for this collection, as our goal is to improve the accuracy of areas that could not be
traveled by Apple vehicles, including, but not limited to narrow streets, blocked streets, pedestrian corridors,
parks, pavilions, and public trails. Many of these components are also leveraged in other data collection efforts,
and as such there may be similarities.
Another sensor to aid the navigation data is the Inertial Measurement Unit (IMU). The data from this source is
combined with the GPS traces to give an accurate picture of how the system has moved over time.
There are many manufacturers of devices which collect this type of information. For example Maxon and
Vectornav distribute a system that is leveraged for such purposes in the industry field.
2D Still Imagery
2D still imagery is a highly effective tool used in map data production, which allows for data editors to append
key ‘attributes’ to the GPS trace information. Examples of attributes include, but are not limited to:
• Street and pathway geometries
• Street signs
• Transition zones (e.g., between pedestrian walkways and streets)
APPLE MAPS - STRICTLY CONFIDENTIAL MAP DATA COLLECTION METHODOLOGY
4 of 12
• Barriers
• Landcover (e.g., grass, dirt, concrete)
These are all details which enable the map database to become a truly accurate representation of pedestrian
areas as they correlate to the street network. This enables consumers to enjoy a reliable navigation experience
while changing from different modalities during a trip (for example, starting in a vehicle, then walking, then taking
public transit). Without such attributes correlated to the shape and position of these pedestrian areas, it becomes
exceedingly difficult to provide map display, search, routing and navigation experiences of benefit to the
consumer. In some respects, attributes are a form of metadata. Additionally, raw imagery will be used only to
improve the algorithms that blur faces and license plates in images published in Look Around feature. Only
imagery that has gone through the privacy blurring process described below will be used to develop and improve
other Apple products and services, which may include using data to train models, such as models related to
image recognition, creation, and enhancement. When imagery is used to improve products and services outside
of Apple Maps, Apple ensures images used from each specific location were taken at least six months apart in
order to maintain privacy.
LiDAR
LiDAR (Light Detection and Ranging) is a technology which is a parallel to RADAR, but rather than using radio
signals to detect shape and form, it uses pulses of light to detect shape and form. This technology is commonly
used by digital map makers across the industry field. For the purposes of Apple’s map data collection, LiDAR is
used to establish the height, width and depth of buildings and other structures for multi-dimensional
representation.
LiDAR’s primary functions are:
• Augmenting the position of ‘attributes’, relative to that of other objects.
• Bringing a rough outline of shape to the object being detected, in order to provide confirmation of the
object type should it be unclear.
• Providing specific dimensions for crucial attributes such as wall heights, structural divisions and
transitional areas.
The output of collected LiDAR data is often referred to as a ‘point cloud’ and can only be viewed through very
high end, production level software applications. LiDAR data is commonly used as an ingredient for building
map databases to enable products such as routing and navigation, and for providing three dimensional depth
to objects such as buildings.
The raw LiDAR data is primarily used to help assess the position of objects relative to each other or a different
fixed position. For example, the distance between two walls or the full length of a hallway. LiDAR is also used to
help reconstruct shapes of objects to ensure that the imagery is properly shaped and oriented.
APPLE MAPS - STRICTLY CONFIDENTIAL MAP DATA COLLECTION METHODOLOGY
5 of 12
Most data is formatted by the number of points generated in a particular pulse or reading, some sample output
is:
• Point count: 3,249,019
• ZMin: 302.88
• ZMax: 713.54
• Scan Angle: -5, 21
Data Collection Systems
Backpack System
The backpack system is a frame constructed as a backpack which is worn by the surveyor. The system
captures data about its surroundings as the surveyor walks. This type of portable system has been used by
other mapping companies for some time.
The backpack camera array consists of 5-8 still cameras capturing still imagery (1 Frame Per Second) and 1- 3
LiDAR units (positioned on the side of the frame or under the cameras, collecting dimensional information to
include the distance between two walls or the full length of a hallway) and 1 positioning system (GNSS, IMU).
To capture accurate positioning, the sensor uses 2 GNSS antennas. To capture the imagery during the
collection, the system contains a computer which controls the flow of information throughout the backpack.
The entire system is powered by 2 batteries. In addition, an iPhone may be used to run diagnostics on the
collection system and to perform calibration.
APPLE MAPS - STRICTLY CONFIDENTIAL MAP DATA COLLECTION METHODOLOGY
6 of 12
Security
Backpack System
The data collected by the backpack system is stored on a solid state drive (SSD). The state of the SSD drive is
fully tracked. When full, the SSD is shipped to Apple’s upload facility, copied onto servers, then securely
deleted, and shipped back to the field for re-use. Apple has followed this process for the vehicle data collection
program since 2015 and it has proved to be secure and successful.
The tracking system monitors the SSDs in multiple states, including:
• Idle/at upload facility
• In transit for use
• Deployed and not in use
• Deployed and in use
• In transit for data ingestion
• Data removed for ingestion
All collected data is encrypted to the SSDs using 128-bit AES with PKCS7. The private encryption key is only
held by Apple. It can be changed and a new key deployed within 24 hours.
All software used in the data collection process runs on Apple hardware and is also developed by Apple. It
provides monitoring, insight into the settings for the cameras and alerts the operator if there are errors in the
system.
Once on Apple premises the data follows standard Apple security protocols
• Data is only stored on physically and logically secured servers
• Access to the data is tightly controlled through Apple access control lists
Image Blurring
As outlined above, before the publication of any form of imagery, steps are taken to blur faces and license
plates so that they are not identifiable in the published product. Because our aim is to capture only
stationary objects, our first objective is to remove anything that is in motion, including the likeness of
any person, through face blurring. The second objective is to recognize the objects we do wish to capture
for use in our product, such as a store front or a transitional area like an elevator or stairs. To this
end, we train the software to recognize what a store front looks like, and such areas are identified and
processed accordingly. In this way, Apple strives to remove identifiable information from pedestrian
collection data by blurring faces and license plates. Apple uses quality assurance processes, both
automated and manual, to ensure the products are developed to Apple's privacy standards. Once the imagery
is blurred, Apple may publish the imagery in Look Around and use it to develop and improve Apple Maps and
other Apple products and services, including training models related to image recognition, creation, and
enhancement.
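Apple's detection models and blur method are not described here; the sketch below illustrates only the final step of the process above, in which a region already identified by a detector (e.g. a face bounding box) is rendered unidentifiable. It uses a simple mean-filter box blur on a toy 2D grayscale image; all names and values are illustrative:

```python
# Minimal sketch of region blurring, assuming detection has already
# produced a bounding box. The mean-filter blur here stands in for
# whatever production blur is actually applied.

def blur_region(img, x0, y0, x1, y1, radius=1):
    """Return a copy of a 2D grayscale image with the box [x0,x1) x [y0,y1)
    replaced by the mean of each pixel's (2*radius+1)^2 neighborhood."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(y0, y1):
        for x in range(x0, x1):
            vals = [img[j][i]
                    for j in range(max(0, y - radius), min(h, y + radius + 1))
                    for i in range(max(0, x - radius), min(w, x + radius + 1))]
            out[y][x] = sum(vals) // len(vals)
    return out

# A single bright pixel inside the detected box is averaged away:
img = [[0] * 5 for _ in range(5)]
img[2][2] = 90
blurred = blur_region(img, 1, 1, 4, 4)
assert blurred[2][2] == 10  # 90 spread over a 3x3 neighborhood
```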
Data Retention
For the purpose of producing a visual representation of streets and buildings, all collected imagery is
intended to be retained in its raw collected state for a period of 12 months after the date of collection.
This allows the imagery to be consulted in its raw state while we continue to refine this product and our
means of ensuring fully effective blurring of imagery, including during collection. Where requested by a data
protection authority, these periods are varied on a per-country basis to ensure alignment with local
requirements.
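As a worked illustration of the 12-month rule above, the sketch below computes a deletion deadline from a collection date, assuming "12 months" means the same calendar day one year later (the exact rule Apple applies is not stated in this document):

```python
# Illustrative computation of the 12-month retention deadline.

import calendar
from datetime import date

def retention_deadline(collected: date, months: int = 12) -> date:
    """Date after which raw imagery falls due for deletion."""
    year = collected.year + (collected.month - 1 + months) // 12
    month = (collected.month - 1 + months) % 12 + 1
    # Clamp the day when the target month is shorter (e.g. Feb 29 -> Feb 28).
    day = min(collected.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

assert retention_deadline(date(2026, 4, 27)) == date(2027, 4, 27)
```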
The chosen period reflects the reality that elements of these products will remain under continuous active
development and refinement, with a precise release date for some elements yet to be finalized. We release
products to our customers when they meet the exacting quality standards that we set for ourselves and which
our customers expect. The proposed retention periods for the imagery reflect that it is our raw material for
developing our product to the highest standards.
Query Handling
Apple has a dedicated team that handles queries or requests from the public. Apple responds to such queries
within a short period in accordance with GDPR and other applicable privacy laws. All individuals can contact us
via the contact form which is available from our Maps Data Collection website
http://maps.apple.com/imagecollection/. Requests received through our standard privacy contact form are also
handled.
In the event of publication of images, users who wish to report concerns can use the Report an Issue feature
in Apple Maps. This is our preferred method, as it allows us to fully track requests and to respond on the
precise image of concern, which is submitted as part of the reporting flow. Non-Apple users can use the
Report an Issue feature via Maps in a web browser at maps.apple.com. Where a person does not wish to pursue
either of these options, Apple's contact email address remains available.
Requests for access to or deletion of raw imagery by an individual are processed by our team upon provision
of appropriate identification of the location where the image was likely collected and of its timing within a
15-minute window; this information is used solely to allow us to locate the imagery in question.
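To make concrete why a 15-minute window is useful, the hypothetical sketch below narrows a search over timestamped frames to those near the time a requester reports; the data layout and function are illustrative, not Apple's actual tooling:

```python
# Hypothetical lookup of collected frames near a reported time.
# Frame records are modeled as (timestamp, frame_id) pairs.

from datetime import datetime, timedelta

def frames_in_window(frames, reported: datetime, minutes: int = 15):
    """Return the frames whose timestamps fall within +/- `minutes`
    of the time reported by the requester."""
    delta = timedelta(minutes=minutes)
    return [f for f in frames if abs(f[0] - reported) <= delta]

frames = [
    (datetime(2026, 5, 1, 10, 0), "frame-001"),
    (datetime(2026, 5, 1, 10, 40), "frame-002"),
]
hits = frames_in_window(frames, datetime(2026, 5, 1, 10, 10))
assert [fid for _, fid in hits] == ["frame-001"]
```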
Public Notice
For the backpack system collections, we update our Maps Data Collection website
http://maps.apple.com/imagecollection/ to inform individuals of the areas in which we will be collecting images.
This website contains a detailed map view of the streets where collection will take place. An example of this
map view is found in Appendix 3.
Backpack systems are clearly marked with Apple Maps branding so that individuals with questions can raise
them. Individuals conducting these collections will wear Apple branding with a link to our Maps Data Collection
website.
Video
Apple does not collect video as part of the pedestrian data collections.
Image Publication
The primary goal of this map data collection is to improve the quality of the Apple Maps database. Apple may
publish the images collected from the backpack system in some form at a future date if the collection is
successful. Any publication will follow the privacy and security protocols described in this document, which
includes the blurring of faces and license plates.
Plans for Future Collections
Apple is planning future data collections using this methodology in accordance with the schedule found in
Appendix 3.
Appendix 1: Photographs of the current Backpack System
Appendix 2: Terminology
GPS (Global Positioning System) — a satellite-based radio sensor used to determine the ground position of an
object. The components of this sensor are affixed to the back and top left of the collection system.
IMU (Inertial Measurement Unit) — sensor package attached to the back frame of the system containing
accelerometers, gyroscopes, and a GPS receiver.
LiDAR (Light Detection and Ranging) — a technology analogous to RADAR, but which uses pulses of light rather
than radio signals to detect shape and form. For the purposes of Apple's map data collection, LiDAR is used
to establish the height, width, and depth of buildings and other structures for multi-dimensional
representation.
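The height/width/depth derivation mentioned in the LiDAR definition can be sketched, in highly simplified form, as taking the extents of the axis-aligned box enclosing a structure's returns; the point data here is invented for illustration, and real processing is far more involved:

```python
# Toy sketch: deriving width/depth/height of a structure from LiDAR
# returns, modeled as (x, y, z) points in meters.

def extents(points):
    """(width, depth, height) of the axis-aligned box enclosing the points."""
    xs, ys, zs = zip(*points)
    return (max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs))

building = [(0.0, 0.0, 0.0), (12.5, 0.0, 0.0), (12.5, 8.0, 0.0), (0.0, 8.0, 21.0)]
assert extents(building) == (12.5, 8.0, 21.0)
```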
Appendix 3:
Planned Collection Dates and Locations: Backpack System — Estonia
| State or City | Approx. Number of Backpack Systems | Anticipated Start Date* | Anticipated End Date |
| Tallinn | 4 | 2026-04-27 | 2026-06-05 |
| Tartu | 4 | 2026-04-27 | 2026-06-03 |
* The anticipated start date for each city may be adjusted due to unforeseen circumstances, such as weather
conditions.
Sample of Public Disclosure Map for Backpack System Collections:
The areas scheduled for collection with the backpack system will be disclosed to the public in a dynamic map
interface on https://maps.apple.com/imagecollection. The following are screenshots of the view presented to
users.
Tallinn (City View)
Tallinn (Neighborhood View)
Tartu (City View)
Tartu (Neighborhood View)