With the increasing focus on information security across all sectors of government, IT policies are placing increased restrictions on information architectures, including GIS. While these restrictions may not prevent the development of a robust enterprise geospatial architecture, the approval and accreditation processes can introduce significant delays, during which work must continue. This is where workarounds come into play.
For example, while supporting a recent customer, we needed a way to mass update a series of static maps. After gathering initial requirements, we realized that the information on the maps would need to be updated many times per day over a period of several months. Senior leadership wanted to be able to see the most recent updates at any time. Due to network infrastructure limitations and restrictions, it was not possible to take a standard approach such as setting up an intranet-based web server and serving dynamic maps. We needed a way to automate the entire process so that we could meet our reporting requirements without dedicating a full-time analyst to this single task.
The basics of the process were as follows: a team of users maintained a Microsoft Excel spreadsheet containing the information that needed to be mapped. This data needed a unique identifier so that it could be tied to a spatial dataset and displayed within ESRI’s ArcMap. In our case we were using Unified School Districts from the US Census TIGER data, which identifies each district with a GEOID, so that identifier was chosen to link the spreadsheet to the spatial data. The Unified School Districts polygon dataset was downloaded, and a companion point dataset was produced from the polygon centroids. Next, the spreadsheet was modified slightly so that it would behave as expected within ArcGIS for Desktop; this primarily involved adding and populating the GEOID field, formatting the date columns, and so on. With the data configured, it was time to develop the process that would tie the spatial data to the non-spatial data and update a map layer.
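In practice an ArcGIS tool such as Feature To Point handles the centroid derivation; purely for illustration, the area-weighted centroid math it relies on can be sketched in plain Python:

```python
def polygon_centroid(ring):
    """Area-weighted centroid of a simple polygon ring.

    `ring` is a list of (x, y) vertices in order, without repeating
    the first vertex at the end. Uses the standard shoelace formula.
    """
    a = cx = cy = 0.0
    n = len(ring)
    for i in range(n):
        x0, y0 = ring[i]
        x1, y1 = ring[(i + 1) % n]
        cross = x0 * y1 - x1 * y0  # signed twice-area contribution
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5
    return cx / (6 * a), cy / (6 * a)
```

For example, the centroid of the unit square `[(0, 0), (1, 0), (1, 1), (0, 1)]` comes out as `(0.5, 0.5)`. Real district boundaries have holes and multipart geometries, which is why the actual work is left to the geoprocessing tool.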
We developed a geoprocessing model that connected to the Excel worksheet, migrated it into a file geodatabase table, joined it to the point dataset, and calculated a field. The calculated values were used to display status information within the map layer.
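In ArcGIS terms the model chained standard tools (Excel To Table, Add Join, Calculate Field). Stripped of the ArcGIS specifics, the join-and-calculate logic looks roughly like the sketch below, where the field names and status codes are hypothetical:

```python
# Hypothetical mapping from spreadsheet status text to the numeric
# values used for symbology in the map layer (illustrative only).
STATUS_VALUES = {"complete": 3, "in progress": 2, "not started": 1}

def join_status_to_centroids(centroids, status_rows):
    """Join spreadsheet rows to district centroid records on GEOID and
    calculate a numeric status field, mirroring the model's join and
    field-calculate steps."""
    status_by_geoid = {row["GEOID"]: row for row in status_rows}
    joined = []
    for point in centroids:
        row = status_by_geoid.get(point["GEOID"])
        if row is None:
            continue  # district has no status entry yet
        joined.append({
            **point,
            "STATUS": row["STATUS"],
            "STATUS_VALUE": STATUS_VALUES.get(row["STATUS"].lower(), 0),
        })
    return joined
```

The unmatched-record behavior (skip versus keep with a default value) is the same decision you make when choosing a join type in the Add Join tool.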
Once the map template and geoprocessing workflow were complete, it was time to consider automation. Because of the status update frequency, we decided to script the update process using Python and attach it to a Windows scheduled task. During this phase we also added code to export the map to KML, add a timestamp to the exported PDF's file name, and copy the files to two shared storage locations.
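The timestamp-and-copy step can be sketched with the standard library alone; the file name pattern and share locations below are assumptions, not the customer's actual paths:

```python
import shutil
from datetime import datetime
from pathlib import Path

def publish_map_export(pdf_path, share_dirs, now=None):
    """Copy an exported map PDF to each shared location with a
    timestamped name, e.g. status_map_20240101_0630.pdf."""
    now = now or datetime.now()
    src = Path(pdf_path)
    stamped = f"{src.stem}_{now.strftime('%Y%m%d_%H%M')}{src.suffix}"
    copies = []
    for share in share_dirs:
        dest = Path(share) / stamped
        shutil.copy2(src, dest)  # copy2 preserves file timestamps
        copies.append(dest)
    return copies
```

Embedding the timestamp in the name means each run leaves a history of snapshots on the share rather than silently overwriting the previous map.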
After the scripting was complete, the script was wrapped in a batch file and attached to a series of scheduled tasks. This allowed the process to read the originating Excel worksheet several times a day and update the map with the latest information. Any time senior leadership wanted to see the status of the project, they could simply open the file sitting in a shared location.
This approach enabled our team to work with our users to produce up-to-date map products using available tools and working within the constraints of our information technology architecture.