
We’ve all been there - staring at the progress bar, wondering why Task Manager says our 32 cores are sitting nearly idle while a geoprocessing tool churns through a huge dataset. OK, that huge dataset isn’t so large anymore - it’s the new normal. But why is ArcMap/Pro toying with us? We did the usual tweaks: local data, fast drives, 64-bit Background Geoprocessing (BGP, for ArcMap), plenty of free RAM. Still, the abnormally powerful “GIS workstation” (the one we had to beg for and write a justification document to get) sits there barely spinning its fans. It could be that we aren’t taking advantage of the Parallel Processing Factor in ArcGIS.

Parallel Processing?

Parallel processing is a technique that splits a task into many smaller chunks. The chunks are then assigned to multiple CPUs, cores, or processes that work on parts of the job at the same time. This technique can often result in faster processing times for larger datasets. Over the past few releases, Esri has been increasing the number of tools that can take advantage of parallel processing, allowing some tools to distribute their work across multiple cores. Very handy if you have a good computer and want to take advantage of the extra power.
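The chunking idea can be sketched generically in JavaScript (illustrative only - this is not how ArcGIS implements it, and true CPU parallelism in Node would use worker_threads; ArcGIS handles the process management internally):

```javascript
// Split a job into smaller chunks, then process the chunks concurrently.
// Conceptual sketch of "divide the work, run the parts together".
function chunk(items, parts) {
  const size = Math.ceil(items.length / parts);
  const out = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

async function processInChunks(items, parts, workFn) {
  // Each chunk is handed off as its own unit of work
  const results = await Promise.all(chunk(items, parts).map(workFn));
  return results.flat();
}
```

The tools that support the Parallel Processing Factor do something conceptually similar: divide the dataset, dispatch the pieces to separate processes, then stitch the results back together.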

There are some limitations, of course. First, not all tools currently take advantage of this. The list of supported tools has been expanding over time, so it’s worth reviewing your favourite tools after an upgrade. Second, you might not get the performance boost you’d expect, depending on the data or the analysis you are performing. If using ArcMap, make sure the tool is also running in 64-bit mode via Background Geoprocessing (BGP).

64-bit and multithreading

Parallel processing is related to 64-bit processing and multithreading, but there are differences. For most geoprocessing tools, performance is roughly the same between ArcGIS Pro and ArcMap (if using BGP), while some tools in Pro are being updated to be faster over time. 64-bit geoprocessing does not automatically make tools faster - but it does allow more memory to be allocated during processing. A 32-bit process is limited to 2 GB of memory, while 64-bit removes this limitation.

64-bit geoprocessing is more robust, results will be more accurate, and processes that previously hung, crashed, or ran out of memory may be able to complete successfully ~ Esri

Likewise, multithreading doesn’t increase the performance of geoprocessing directly. However, on a decent computer, multithreading allows geoprocessing to run in its own thread. Running independently, the analysis leaves the main thread free for other functions such as panning/zooming the map, working with symbology, etc. So multithreading is still very useful if you want to keep working in your map while the analysis churns away on its own thread. Multithreading is used in both Pro and ArcMap (if using background processing in ArcMap).

ArcGIS Parallel processing

So how can we speed up the process? One possible way is to take advantage of the Parallel Processing Factor in ArcGIS. Again, not all tools are set up for parallel processing, but Esri has mentioned that more tools gain this feature at every release: ArcGIS Pro 2.1 supports it in roughly 70 geoprocessing tools, and ArcMap 10.6 in approximately 30.

ArcGIS parallel processing is managed by the Parallel Processing Factor environment setting. Some tools now default to parallel processing, while most still require you to set the factor yourself. The documentation does a good job of letting you know which ones, but you can always set the parameter just in case - or to override the default. In ArcGIS, the environment section of a tool will list this setting if the option is available. You can set this variable using the following options:

  • Blank/empty - let each tool determine its own default behaviour
  • An integer - the number of processes to use (0 disables parallel processing)
  • A percentage, such as "80%" - use that share of the available cores

The Parallel Processing Factor is also available when using the ArcPy module with Python. It can be set like other environment settings at the top of your script if you want to apply it to all tools that accept it: arcpy.env.parallelProcessingFactor = "80%".

Final tips

Following the usual best practices, along with leveraging the Parallel Processing Factor, should reduce the amount of time you spend watching the progress bar. However, don’t go overboard: if you specify more processes than you have cores available, you could negatively impact performance. One exception to that rule is when the analysis is I/O-heavy or writes directly to an enterprise database connection:

The Add Rasters to Mosaic Dataset tool is I/O bound when the mosaic dataset is stored in an enterprise database. Also, the Build Overviews tool is primarily I/O bound to the disk. You can use more processes than your machine has cores by either specifying a percent value greater than 100% or a number of processes greater than the number of cores on your machine. ~Esri

Helpful Links

  • https://en.wikipedia.org/wiki/Parallel_computing
  • https://www.techopedia.com/definition/4598/parallel-processing
  • https://pro.arcgis.com/en/pro-app/tool-reference/environment-settings/parallel-processing-factor.htm
  • http://desktop.arcgis.com/en/arcmap/latest/tools/spatial-analyst-toolbox/parallel-processing-with-spatial-analyst.htm#ESRISECTION1BB63FA79C2A9411690A1AC958EECFC05


In part 1 we went over the main components to create a JavaScript weather app with location data. From finding the device location, getting weather data from the internet machine (DarkSky API), and building a basic HTML interface. However, one limitation mentioned was how to keep the DarkSky “secret key” a secret.

In part 2 we will build a simple reverse proxy to hide our key on a server. A reverse proxy takes requests from clients and retrieves the resources on their behalf from one or more servers. These resources are then returned to the client, appearing as if they originated from the proxy server itself. To create the proxy, we will use Node.js (server-side JavaScript). There are many other approaches, but Node.js lets us stay in JavaScript and is one of the most widely used runtimes today. And who doesn’t love a little JavaScript!

The team at DarkSky also highly recommends that you protect your key. They feel strongly enough about it that they disabled cross-origin resource sharing (CORS) as a security precaution.

Your API call includes your secret API key as part of the request. If you were to make API calls from client-facing code, anyone could extract and use your API key, which would result in a bill that you’d have to pay. We disable CORS to help keep your API secret key a secret. To prevent API key abuse, you should set up a proxy server to make calls to our API behind the scenes. Then you can provide forecasts to your clients without exposing your API key. ~ DarkSky FAQ

Yeah, let’s do that… with JavaScript (Node.js)!

Node.js in a nutshell

Node is simply a runtime (environment) that allows us to run JavaScript outside a browser. With Node, we can build web servers, server APIs, web apps, desktop apps, and much more. When running JavaScript server-side, you also don’t need to worry about web browser compatibility, since the JavaScript engine is always the same: Chrome’s V8.

Node.js® is a JavaScript runtime built on Chrome’s V8 JavaScript engine. Node.js uses an event-driven, non-blocking I/O model that makes it lightweight and efficient. ~ Node.js

Node.js also uses packages to provide additional functionality, much like modules in Python. Many are included by default, but there are thousands of other great external packages that can help when needed.

For more details on Node.js, there are lots of amazing resources out there: the Node.js website, a large library of YouTube videos, and plenty of Medium articles will give you an in-depth look at this very popular runtime.

The Proxy

For the proxy, we are going to use three modules. We will add a few more optional ones later for some additional security and validation. Dot-Env could also be considered optional: it keeps the key outside of the main code (handy if you want to post your code to GitHub, but not your key), though there are other ways to accomplish that step. Here are the main packages:

  • Express: Web application framework that provides a robust set of features for web and mobile applications.
  • Request: Request is designed to make HTTP calls. It supports HTTPS and follows redirects by default.
  • Dot-Env: A zero-dependency module that loads environment variables from a file. This stores the configuration in the environment separate from the code.
The Project Setup

We will use a very simple project structure for this work. A server-side JavaScript file (server.js), and a Dot-Env file (.env) to store our secret key.

The .env file will contain a single variable with the key. The variable can be named whatever you would like - however, no spaces, and ALL_CAPS is the common convention. Let’s set this up as “DARKSKY_SECRET”. When we are ready to access this variable inside the server application, we just prefix the variable name with “process.env.”. Based on our variable name above: `process.env.DARKSKY_SECRET`.

//.env file
DARKSKY_SECRET="abc123"
The Code

To build the reverse proxy server, we start by creating the server.js file and loading the modules we will be using.

//server.js - load the modules
const express = require('express');  
const request = require('request');
require('dotenv').config();
const app = express();

With the modules loaded it’s time to define a route (API endpoint). This endpoint will take the request from the client web application along with URL parameters. Once we have the request on the server, we will add our key and send our own request to the DarkSky API from the server. The response from DarkSky will be handed back to the client.

In the client app, we will change our original code from Part 1 that talks to DarkSky directly - subbing in our server route /darkSky/:lat,:long. Both the “:lat” and “:long” keywords are placeholders on the server that will be populated when the client makes a request. A sample request from the client would look like https://myDomain.com/darkSky/45.123,-80.123. The server will interrogate the request, and if it matches the route name, the request will be forwarded. To access our lat/long variables in the request, we use the req.params object (for example, req.params.lat).

//server.js - setup the route
app.get('/darkSky/:lat,:long', (req,res) => {
    //DarkSky URL we will send the request to
    var dsURL = 'https://api.darksky.net/forecast/';
    //Define a variable pointing to our secret key
    var dsSecret = process.env.DARKSKY_SECRET;
    //Additional DarkSky  URL parameters (optional)
    var dsSettings = '?exclude=minutely,hourly,daily,alerts,flags&units=auto';
    //Build the full DarkSky URL
    var url = dsURL + dsSecret + '/' + req.params.lat + ',' + req.params.long + dsSettings;
    //Send the request and direct the results back to the user
    req.pipe(request(url)).pipe(res);
});
//app starts a server and listens on port 3000 for connections
app.listen(3000, () => console.log('server ready'));
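To illustrate what the :lat and :long placeholders are doing, here is a rough sketch of the matching idea - Express compiles route patterns internally, so this regex is illustrative only, not Express’s actual implementation:

```javascript
// Illustrative only: roughly what matching '/darkSky/:lat,:long' does.
// Express handles this internally; the regex just mimics the idea.
function matchDarkSkyRoute(path) {
  const m = /^\/darkSky\/(-?\d+(?:\.\d+)?),(-?\d+(?:\.\d+)?)$/.exec(path);
  return m ? { lat: m[1], long: m[2] } : null; // null means "no match"
}
```

When a path matches, Express populates req.params with the captured values, which is exactly what the route handler above reads.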

If we were to stop here, this entire exercise wouldn’t be worth the effort. We’ve created a reverse proxy that hides the key, but it doesn’t stop anyone else from using our proxy/routes with external apps. So anyone could leverage the proxy - while DarkSky sends us the bill. We aren’t going to add a full authentication tier, but it’s something to consider for production sites.

Basic protection

With the key hidden, let’s focus on adding some server protection. More could/should be done, and we should consider the below example a bare minimum. To keep this painless, let’s add a few additional modules to do the heavy lifting for us.

  • Helmet: Helps secure your Express apps by setting/blocking various HTTP headers
  • host-Validation: Extends Express (middleware) that protects Node.js servers from DNS Rebinding attacks by validating Host and Referer headers from incoming requests
//server.js - add protection
const helmet = require('helmet');
const hostValidation = require('host-validation');
//configure hostValidation to only accept requests from specific hosts and referers
//  if the following hosts/referers don't match the client, the request will be rejected
app.use(hostValidation({ hosts: ['myDomain.com'], referers: ['https://myDomain.com/weather.html'] }));
//Use helmet's default header rules for added security
app.use(helmet());
Rate Limiting

Rate limiting controls the rate of traffic received by the server and helps prevent DoS attacks. We aren’t going to throttle (slow) each request to our server, just make sure each client doesn’t abuse the server with thousands of requests per minute (remember, we want to keep this API from spending our beer money). We can set both a timeframe and the maximum number of requests to allow within that timeframe - for each IP requesting a weather update.

//server.js - add rate limiting
const rateLimit = require("express-rate-limit");
//app.enable("trust proxy"); //if already behind a reverse proxy
const limiter = rateLimit({
  windowMs: 5 * 60 * 1000, // 5 minutes
  max: 20 // limit each IP to 20 requests per windowMs
});
app.use(limiter); //apply to all requests
Putting it all together

Here is all the code put together. In total, we will need server.js and the .env file in the root of the project along with all of the modules in the default node_modules folder.

//server.js
const express = require('express');  
const request = require('request');
require('dotenv').config();
const helmet = require('helmet');
const hostValidation = require('host-validation');
const rateLimit = require("express-rate-limit");
const app = express();
app.use(hostValidation({ hosts: ['spatialtimes.com'], referers: ['https://www.spatialtimes.com/weather.html'] }));
app.use(helmet());
const limiter = rateLimit({
  windowMs: 5 * 60 * 1000,
  max: 20
});
app.use(limiter);

//Setup route for DarkSky
app.get('/darkSky/:lat,:long', (req,res) => {
  var dsURL = 'https://api.darksky.net/forecast/';
  var dsSecret = process.env.DARKSKY_SECRET;
  var dsSettings = '?exclude=minutely,hourly,daily,alerts,flags&units=auto';
  var url = dsURL + dsSecret + '/' + req.params.lat + ',' + req.params.long + dsSettings;
  req.pipe(request(url)).pipe(res);
});

const port = 8001;
app.listen(port, () => console.log('server ready'));

Now that our server is ready with the reverse proxy, we just need to change the URL request in the HTML file to call our new endpoint instead of going to the DarkSky API directly.
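For example, a hypothetical client-side helper might assemble the new endpoint URL (buildProxyUrl is my name, and myDomain.com stands in for wherever server.js is hosted):

```javascript
// Hypothetical client-side helper: point requests at our proxy route
// instead of api.darksky.net. 'base' is wherever server.js is hosted.
function buildProxyUrl(base, lat, lng) {
  return base + '/darkSky/' + lat + ',' + lng;
}
```

The client never sees the secret key - the proxy appends it server-side before forwarding the request to DarkSky.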


In the previous post, we discussed the possibility of writing our own location-based weather application. So let’s try it! In part 1 we will build a simple app that gets the user’s location and fetches weather data from an API using HTML and JavaScript. This approach is similar to how various mapping APIs find the user location, and it will take us a bit behind the scenes of how some pre-built location map widgets really work.

Here are the steps we will work through to build the app:

  1. Get user coordinates using the HTML5 GeoLocation API
  2. Request weather data for the coordinates using an online weather API
  3. (optional) Reverse Geocode the coordinates to get additional location details

    • Not required for most weather APIs, but worth touching on
Before we begin: The GeoLocation API

To request a user location (lat/long coordinates) within the web browser, we will leverage the HTML5 Geolocation API. This functionality is available in all major browsers (desktop and mobile), including IE - yeah, crazy, I know. But just because it’s supported doesn’t mean our code will always bring back a valid location. There are a few reasons why we might not get what we are looking for:

  1. When using the API in most browsers the GeoLocation results are only available over secure connections (HTTPS). So the final product should be hosted with HTTPS (we can enable this for free these days, so this shouldn’t be a show-stopper).

    • An exception to the HTTPS rule is local file:// testing which still works.
  2. The Geolocation API might be blocked by the user/settings. In most desktop/mobile operating systems users have the ability to block location services from being exposed to the browser. In addition, the user will be prompted to provide permission for an individual website as per W3C specifications.
  3. A request timeout can occur if the data takes too long to return. An optional PositionOptions object has a timeout setting representing the maximum length of time (in milliseconds) the device is allowed to wait to return a position.

    • The default timeout value is Infinity, which is a long time to wait. It’s a good idea to set a reasonable timeout. Even slower response times should be expected if using this in combination with PositionOptions.enableHighAccuracy.

Note: As of Chrome 50, the Geolocation API will only work on secure contexts such as HTTPS. If your site is hosted on a non-secure origin (such as HTTP) the requests to get the users location will no longer function ~ w3schools.com

Thankfully the Geolocation API has some error checking available to provide some details when troubles occur, including timeouts. It’s then possible to add alternate location approaches such as IP Address lookup, prompt the user for a city, or to change their browser settings.
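To act on the timeout advice above, the settings can be passed as the third argument to getCurrentPosition(); the values below are illustrative choices, not requirements:

```javascript
// Illustrative PositionOptions values (assumptions, not requirements):
// cap the wait at 10 seconds rather than the Infinity default.
const geoOptions = {
  enableHighAccuracy: false, // true trades response time for precision
  timeout: 10000,            // ms to wait before the error callback fires
  maximumAge: 60000          // accept a cached position up to 1 minute old
};

// In the browser, the options object is the third argument:
if (typeof navigator !== 'undefined' && navigator.geolocation) {
  navigator.geolocation.getCurrentPosition(
    pos => console.log('position:', pos.coords),
    err => console.log('geolocation error:', err.message),
    geoOptions
  );
}
```

With a timeout set, a slow GPS fix lands in the error callback instead of hanging forever, where the fallback approaches mentioned above can take over.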

1. Get user coordinates using the HTML5 GeoLocation API

To start, we should determine if the browser has basic support for the Geolocation API. Again, even IE9+ supports this but still a good idea to test and very simple to implement. If for some reason there is no support, we can send the user a message letting them know their browser sucks.

//Check if the geolocation API exists
if (navigator.geolocation) {
  //true
  alert('Lets get the location (placeholder)');
} else {
  //false
  alert('geolocation not available?! What browser is this?');
  // prompt for city?
}

Assuming everything goes well we can swap our placeholder with some code to request the actual location coordinates. The GeoLocation API has three main methods:

  • getCurrentPosition() - used to return the user’s position. A single request per call. This is the one we will use for the example
  • watchPosition() - Returns the current position of the user and continues to return updated position as the user moves
  • clearWatch() - Stops the watchPosition() method

The getCurrentPosition and watchPosition methods are asynchronous, which means we create a callback to deal with the location data whenever it’s returned (or when an error is returned). A successful return can have up to 8 properties, however, only 3 of these are guaranteed: latitude, longitude, accuracy.

Here is a short sample with no error checks with an inline success callback so we can easily identify the basics:

//Short sample version with inline success callback
if (navigator.geolocation) {
    //Initiate a request for the location
    navigator.geolocation.getCurrentPosition(function(pos){
      //'pos' return object has many properties we can grab
      var geoLat = pos.coords.latitude.toFixed(5);
      var geoLng = pos.coords.longitude.toFixed(5);
      var geoAcc = pos.coords.accuracy.toFixed(1);
    });
}

Here is a more complete sample with both the success and error callbacks in place. Error messages are broken out just in case we want to add more options in the future.

//More complete version
if (navigator.geolocation) {
    // Request the current position
    // If successful, call getPosSuccess; On error, call getPosErr
    navigator.geolocation.getCurrentPosition(getPosSuccess, getPosErr);
} else {
    alert('geolocation not available?! What year is this?');
    // IP address or prompt for city?
}

// getCurrentPosition: Successful return
function getPosSuccess(pos) {
  // Get the coordinates and accuracy properties from the returned object
  var geoLat = pos.coords.latitude.toFixed(5);
  var geoLng = pos.coords.longitude.toFixed(5);
  var geoAcc = pos.coords.accuracy.toFixed(1);
}

// getCurrentPosition: Error returned
function getPosErr(err) {
  switch (err.code) {
    case err.PERMISSION_DENIED:
      alert("User denied the request for Geolocation.");
      break;
    case err.POSITION_UNAVAILABLE:
      alert("Location information is unavailable.");
      break;
    case err.TIMEOUT:
      alert("The request to get user location timed out.");
      break;
    default:
      alert("An unknown error occurred.");
  }
}
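If you prefer promises over callbacks, the same request can be wrapped; the geolocation object is injected as a parameter here (an assumption of this sketch) so it can also be exercised outside a browser:

```javascript
// Sketch: Promise wrapper around the callback-based geolocation API.
// Pass navigator.geolocation in the browser; injecting it keeps the
// wrapper testable without a browser.
function getPosition(geolocation) {
  return new Promise((resolve, reject) => {
    geolocation.getCurrentPosition(resolve, reject);
  });
}

// Browser usage:
// getPosition(navigator.geolocation)
//   .then(pos => console.log(pos.coords.latitude, pos.coords.longitude))
//   .catch(err => console.log('geolocation error', err));
```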

That’s all it takes. If all goes well, we now have the latitude and longitude coordinates. With this information available, we can pass it to a weather API and/or convert the coordinates into a known location (reverse geocode).

2. Request weather data for the coordinates using an online weather API

There are many weather APIs available, and most of them have a free tier (for low usage and/or testing). Each has pros and cons, including various advanced features (15-day forecasts, multiple forecast models, etc.). One major change in this area is the recent shutdown of the very popular Yahoo! Weather API, probably one of the most used weather APIs of the past decade. As of January 3rd, 2019, the service is offline, with a replacement service just starting its “by-request onboarding” phase. Since that’s not currently available, we are going to use DarkSky. This API has an easy-to-use free tier and accepts lat/long inputs. If using it for production purposes, it’s advised to review the available APIs to find one that best fits your needs, along with the appropriate Terms of Use.

Side note on weather APIs: it looks like this world is collapsing into more of an oligopoly. Weather Underground was purchased by The Weather Channel (2012), which was then acquired by IBM (finalized in 2017). Weather Underground was set to retire its API on December 31, 2018, but has extended this to February 15, 2019, to allow more transition time.

Side note on Dark Sky: This is not a sponsored post and I have no affiliation/relationship with DarkSky. The use of their API is based on it being easy to use, good documentation, and a free usage tier.

Using DarkSky’s Weather API

The DarkSky developer API will require you to register an account to get access to a key. Once complete, a request is fairly simple:

<!-- API Params -->
https://api.darksky.net/forecast/[key]/[latitude],[longitude]
<!-- Sample Request -->
https://api.darksky.net/forecast/myFakeKey123abc/43.642567,-79.387054

The resulting JSON data returned from a request like this can be pretty substantial. We can reduce the response using query parameters (filters) to focus on what we want, and also set the measurement units (si, ca, us, uk2, auto). Now we can make a more streamlined request:

<!-- API Params for filters and units (auto) -->
https://api.darksky.net/forecast/[key]/[latitude],[longitude]?exclude=minutely,hourly,daily,alerts,flags&units=auto
<!-- Sample Request with filters and units -->
https://api.darksky.net/forecast/myFakeKey123abc/43.642567,-79.387054?exclude=minutely,hourly,daily,alerts,flags&units=auto
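The URL assembly above can be captured in a small hypothetical helper (buildForecastUrl is my name; the key is a placeholder):

```javascript
// Assemble a DarkSky forecast URL from its pieces.
// 'exclude' is a list of response blocks to filter out;
// 'units' is one of si, ca, us, uk2, auto.
function buildForecastUrl(key, lat, lng, exclude, units) {
  return 'https://api.darksky.net/forecast/' + key + '/' + lat + ',' + lng +
         '?exclude=' + exclude.join(',') + '&units=' + units;
}
```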

Here is the returned JSON object at the CN Tower in Toronto, Ontario, Canada (43.642567, -79.387054) with filters and units set:

{
   "latitude":43.642567,
   "longitude":-79.387054,
   "timezone":"America/Toronto",
   "currently":{
      "time":1546832805,
      "summary":"Partly Cloudy",
      "icon":"partly-cloudy-night",
      "nearestStormDistance":219,
      "nearestStormBearing":75,
      "precipIntensity":0,
      "precipProbability":0,
      "temperature":-5.11,
      "apparentTemperature":-10.56,
      "dewPoint":-13.61,
      "humidity":0.51,
      "pressure":1032.58,
      "windSpeed":14.42,
      "windGust":26.78,
      "windBearing":38,
      "cloudCover":0.59,
      "uvIndex":0,
      "visibility":16.09,
      "ozone":247.23
   },
   "offset":-5
}

Much easier to work with this smaller dataset, but it does remove some advanced attributes we might want to tap into later. For now, we sent a request based on user location and received the weather data. Time to build a basic GUI and grab a beer (who cares if it’s -5 celsius outside). While we let our beer warm up and breathe a little, we will wrap this request into JavaScript.

Building the request in JavaScript

With the DarkSky URL syntax ready to go, we just need to make the call in our code. Well, almost ready. There is one last hurdle: CORS. DarkSky has disabled Cross-Origin Resource Sharing (CORS) on their servers, and for good reason. As recommended by Dark Sky, you should leverage a secure proxy where you can store your secret API key (don’t put this in client-side JavaScript or bad people will do bad things with it).

Your API call includes your secret API key as part of the request. If you were to make API calls from client-facing code, anyone could extract and use your API key, which would result in a bill that you’d have to pay. We disable CORS to help keep your API secret key a secret ~ Dark Sky FAQ

Argghhh, the CORS limitation doesn’t help when building a JavaScript example. As a quick workaround, I’m going to leverage the “Heroku method”, which uses the Heroku public CORS proxy and sends the secret key directly from the client. THIS IS FOR TESTING ONLY, so please don’t use this approach in your published code. Use a private proxy, or create your own, with the secret kept behind the scenes. It’s called a “secret” for a reason.

With the CORS limitation behind us, we just need to make our JavaScript request and use the results in our interface. For this, we are using jQuery’s getJSON() method. jQuery isn’t required here; getJSON() is just a convenient wrapper around XMLHttpRequest(), which you could use instead. I’ll place this in a separate function to keep things easy, and because you could move this entire function into your proxy if desired (just send geoLat and geoLng to the proxy to keep your secret key hidden).

jQuery’s getJSON() requests JSON-encoded data from the server using an HTTP GET request. It takes a URL (like the weather ones we crafted earlier) and returns a jqXHR object that implements the Promise interface, so we can chain multiple callbacks: a successful response routes to .done(), while an error response is directed to .fail().

const _dsSecret = "yourSecret"; //Again, for testing only - should be hidden in the proxy

function fn_getWeatherByLL(geoLat,geoLng){
  //API Variables
  var proxy = 'https://cors-anywhere.herokuapp.com/';
  var dsAPI = "https://api.darksky.net/forecast/";
  var dsKey = _dsSecret + "/";
  var dsParams = "?exclude=minutely,hourly,daily,alerts,flags&units=auto";
  //Concatenate API Variables into a URLRequest
  var URLRequest = proxy + dsAPI + dsKey + String(geoLat) + "," + String(geoLng) + dsParams
  //Make the jQuery.getJSON request
  $.getJSON( URLRequest )
    //Success promise
    .done(function( data ) {
      var wSummary = data.currently.summary;
      var wTemperature = data.currently.temperature;
      // lots of results available on the data object
      // use the results to populate the GUI here
    })
    //Error promise
    .fail(function() {
      alert('Sorry, something bad happened when retrieving the weather');
    }
  );
}
Have Weather > populate GUI

Once we have weather data for the location, we can populate the user interface. In the code above, the data object contains all the information needed to build your weather dashboard. From current temperature, daily max/mins, and short-term forecast - how you present this to the user is up to you.
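As a sketch, a hypothetical helper (formatCurrently is my name, not DarkSky’s) might pluck out just the fields the dashboard displays:

```javascript
// Hypothetical helper: turn DarkSky's 'currently' block into display strings.
// Rounds the temperature to one decimal and appends a degree sign.
function formatCurrently(data) {
  return {
    summary: data.currently.summary,
    temperature: data.currently.temperature.toFixed(1) + '\u00B0'
  };
}
```

The returned strings can then be dropped straight into the HTML elements of your interface.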

The interface above is based on an example JSFiddle, populated with results I requested in advance using the methods above. The only thing needed is to update the HTML text of the element IDs when the data object is returned in the promise.

Notice we don’t have a city/place name on the page, just latitude and longitude - Turns out this information isn’t available in the return object. If you want to find the place name of the current location, see the Reverse Geocoding section below.

Dark Sky Forecast Embeds

At this point, I should probably confess that Dark Sky also has a number of embeddable weather widgets. Don’t be mad - if I had started with this information, we wouldn’t know what it takes to make it all happen, and we’d possibly miss out on some JavaScript goodness. You’re welcome. Dark Sky offers several ways to include their widgets if you can agree to their widget terms of service. These can be accessed directly from Dark Sky as scripts, as an iframe (via Forecast.io), and from Weatherwidget.io.

If you just want a widget on your page, good news, no secret keys! Just get the location from the Geolocation API and create an iframe on the page dynamically.

function getPosSuccess(pos) {
  // Get the coordinates of the current position.
  var geoLat = String(pos.coords.latitude.toFixed(5));
  var geoLng = String(pos.coords.longitude.toFixed(5));

  //Create an iframe and use the current location data
  var iSource = "https://forecast.io/embed/#lat=" + geoLat + "&lon=" + geoLng + "&name=Woot&color=#00aaff";
  $('<iframe>') // Creates the element
    .attr('src', iSource) // Set the iframe source URL
    .attr('height', 245) // Set the height
    .attr('width', "100%") // Set the width
    .appendTo('#id-weather'); // Append to an existing element ID
}
Code Complete

That covers the basics of finding the device location, getting weather data from the internet machine (DarkSky API), and building a basic HTML interface. Now you can play with the weather data and determine how to enhance the interface.

This is really it for part 1. In part 2 we will build a proxy using Node.js (server-side JavaScript) to show one approach to dealing with secret keys.

Oh, I did mention we could reverse geocode the coordinates as well…

3. Reverse Geocode the coordinates to get additional location details (optional)

For weather data, we don’t need amazing accuracy, because the weather doesn’t change at the street level. So even if the accuracy sucks, it should have little consequence. But if you want to find and show the current city or address, or include a pin on a map - reverse geocoding to the rescue. Using a reverse geocoding API is pretty much identical to using a weather API: provide coordinates in a URL and wait for results.

Again, we have lots of API sources to reverse geocode a lat/long to an address: the Esri search API, Google Maps/Places API, Bing Maps API, and OpenStreetMap API, to name just a few. A future article will explore these options in more detail. For now, let’s take a quick look at OSM Nominatim.

OSM Nominatim

OpenStreetMap’s “Nominatim” service has a REST endpoint to reverse geocode a pair of coordinates. The wiki does an excellent job of covering all the options, but it is handled much the same way as DarkSky. Nominatim requires three parameters: format, latitude, and longitude. Using the format option in the example below, a JSON object is returned with lots of information.

<!-- Sample request -->
https://nominatim.openstreetmap.org/reverse?format=jsonv2&lat=43.642567&lon=-79.387054
// return object
{  
   "place_id":"84050944",
   "licence":"Data © OpenStreetMap contributors, ODbL 1.0. https://osm.org/copyright",
   "osm_type":"way",
   "osm_id":"32742038",
   "lat":"43.6425637",
   "lon":"-79.3870871832047",
   "place_rank":"30",
   "category":"tourism",
   "type":"attraction",
   "importance":"0.466188092284943",
   "addresstype":"tourism",
   "name":"CN Tower",
   "display_name":"CN Tower, 301, Front Street West, Entertainment District, Old Toronto, Toronto, Ontario, M5V 2X3, Canada",
   "address":{  
      "attraction":"CN Tower",
      "house_number":"301",
      "road":"Front Street West",
      "neighbourhood":"Entertainment District",
      "city_district":"Old Toronto",
      "city":"Toronto",
      "state":"Ontario",
      "postcode":"M5V 2X3",
      "country":"Canada",
      "country_code":"ca"
   },
   "boundingbox":[  
      "43.6423338",
      "43.6427965",
      "-79.3874416",
      "-79.3867985"
   ]
}

This provides lots of location information. Worth checking in various rural/urban areas to see how the results differ, but overall this gives a good idea of what to expect.
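To make working with the response concrete, here is a minimal sketch in Python (the same logic applies in browser JavaScript with fetch). It builds the request URL from the three parameters, then pulls the city out of the address object with fallbacks - in rural areas Nominatim may return a town or village key instead of city. The response used here is a trimmed copy of the sample above, so no network call is made:

```python
import json
from urllib.parse import urlencode

# Build the reverse-geocode request URL from the three parameters
base = 'https://nominatim.openstreetmap.org/reverse'
params = {'format': 'jsonv2', 'lat': 43.642567, 'lon': -79.387054}
url = base + '?' + urlencode(params)

# Trimmed copy of the sample response above (no network call here)
sample = '''{
  "name": "CN Tower",
  "display_name": "CN Tower, 301, Front Street West, Toronto, Ontario, M5V 2X3, Canada",
  "address": {
    "attraction": "CN Tower",
    "house_number": "301",
    "road": "Front Street West",
    "city": "Toronto",
    "state": "Ontario",
    "country_code": "ca"
  }
}'''

result = json.loads(sample)
address = result.get('address', {})
# Fall back through coarser fields - rural results may use town/village
city = (address.get('city') or address.get('town')
        or address.get('village') or 'Unknown')
print(city)  # Toronto
print(result['display_name'])
```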


For some reason, my news feeds have recently been filled with articles about weather apps and how many of them are selling our location data to the highest bidder. After reading a few, I started to look at my weather apps (yes, I have a few) in more detail. I also noticed some went missing in my last OS update - namely Yahoo! Weather. Miss you already.

Location data in the news

Location-based weather apps are everywhere, but lately they’ve been in the news for the wrong reasons. Turns out, many app developers are taking our location data, even if we say not to, and selling it to third parties. If we grab a “free” app, we should assume that most of the time the developer still wants to make money. How do they make bank? By selling the data they collect - location data among it. So why do weather apps get stuck holding the headlines when they surely aren’t the only ones doing this (Facebook)? Well, a few reasons off the top of my head why weather apps are at the forefront:

  1. There are so many of them:

    • Stats from the website 42Matters show 4,951 free weather apps listed for iOS (as of 2019-01-07)
  2. Weather apps require the user to share location for them to work fully (might still work otherwise, but we tend to share our location for these apps)
  3. They are used daily, so lots of ongoing data gathering
  4. Weather apps are pretty easy to create
  5. Some smart people looked into this and yikes, it’s scary.

Location, Location, Location

Location data - there used to be only a small niche working with this data - the GIS Geek. We would go to conferences and repeat one of our favourite truisms: “80% of all data is spatial”. Sure, we have a hard time proving where that stat came from, but if anything, in today’s world we are probably underestimating.

These days, no matter what you are doing with technology, there is a chance someone is getting tracking data of where you are while doing it. Why not?! Location analytics has also come a long way with big data, AI, and demographic profiling tools at our fingertips. As someone who’s worked in GIS for many years, I’m not shocked this is happening; I’m more shocked that people suddenly know this much about my chosen profession.

Technology is Location Aware

With phones and watches having embedded GNSS chips, and in an always-connected world where we can even trace an IP address to a city block - location data is proliferating. Every site we sign up with can get our IP address, and will probably ask us for our home address or zip/postal code. Companies see the value in this data and are finding more ways to collect and leverage it. This is no different than the old Nielsen ratings that started back in 1947. Companies have always wanted to know how they were doing and to know more about who their audience is, since this helps sell ads and better understand the customer. The data itself is also worth a pretty penny. Combining location and usage data with census information and profiles can give a very detailed picture.

How To Protect Yourself

To be honest, most people are probably somewhat aware this is happening but don’t care. Privacy seems like a thing of the past in the digital world. We may even trust some companies with our data. We also assume that the data is secure, and if shared/sold, we pretend the buyers are secure too. Really?! In Canada, there is a federal privacy law (PIPEDA) setting rules for acquiring “meaningful consent” if a company wants to share your personal information. After reading a few Terms of Use, I’m pretty sure some are even in violation of federal laws in their sharing of data with third parties - especially if they track your location without permission (you can still block this using your OS privacy settings).

Time and time again we hear about another database being hacked. That data we were comfortable sharing initially is now in the hands of the unknown. Still, some might not care (if it doesn’t include their favourite/only password). From a GIS perspective, just think about what you could do with this location data. Hopefully, criminals don’t have a GIS background, or they could do some pretty crazy stuff - beyond understanding their “potential customers” better. Perhaps even a choropleth map.

So think about the vast mountain of data you are sharing with apps and in what combination. Then put on your hacker/GIS hat and think about what you could do with a dataset of 500 million records (Marriott).

How To Protect Others

Education and implementation. We should remind ourselves and others that our data is ours, and has value - so worth guarding. Use a password manager or at least make sure to use unique passwords that aren’t easy to guess. Don’t use the same password for your email account (Gmail, etc) on any other website where you use the same email to register. Only share location data if you think it is worth it and adds value to your use of the app.

Most importantly, there are lots of great articles out there on how to secure your digital self. A good time to explore a few. Then delete some of the 4,951 weather apps from your devices.

Where do we go from here?

Maybe it’s time to try writing our own location-based weather app?! Hmmm, stay tuned…


Spatial Times is getting an upgrade. OK, more of a rebuild from scratch. After 5 years, it was time to update the interface and clean up some server code. There will be some growing pains over the coming weeks, but it will be worth it (he said with trepidation). The biggest changes are behind the scenes, as the site is moving from a sluggish WordPress site to a static site generation (SSG) approach leveraging React and GraphQL. Selfishly, this also reduces my hosting cost, which isn’t a bad thing either.

Static Site Generation with Gatsby

After lots of trial and error, I decided to go with Gatsby to build the new version of the site. This gave me the opportunity to continue my dive into modern development approaches with React.js, Webpack, GraphQL, and TypeScript (honourable mention to NPM, git, Node.js and VS Code of course). Over time I will add more Progressive Web App (PWA) functionality for better offline experience - but one step at a time.

All of the posts have been converted to markdown files as I embrace the JAMstack way of life. With all these changes, some things are bound to break. But if I miss something today, I will hopefully fix it tomorrow. Although if you see something wonky that persists for a while, let me know.

Why Upgrade Spatial Times?

Beyond what’s already been said, the upgrade lends itself to a learning opportunity. As most of the posts are GIS related, this learning isn’t limited to Spatial Times. GIS and the web are constantly evolving, and companies like Google, Esri, and Microsoft are all moving in this direction: Esri recently added the JavaScript API to NPM for creating custom builds; Google is focused on making PWAs the future of web/app development; Microsoft bought GitHub (and is a big contributor), makes VS Code freely available (best code editor IMHO, and built on Node/Electron), and is moving Edge to the Chromium engine.

And there’s performance. Below is an audit report from Lighthouse for the original Spatial Times site on WordPress. You can see the site was pretty good in the SEO and Best Practices categories, had decent accessibility, and dismal performance. To be fair, I didn’t spend much time trying to squeeze performance out of the site - but shouldn’t it be fast by default?!

lighthouse audit of the old site

Without any tweaking of the starting template, the new site is getting a more favourable audit. The new site has a drastic improvement in many areas with Performance and Accessibility being my primary focus. I will spend some time making updates for PWA and SEO, but not going to lose any sleep with these new results.

lighthouse audit of the new site

A Few Changes

I’ve been working to keep the links to articles the same so as not to impact any existing bookmarks or hyperlinks to the site. There are some changes as well:

  • The site is moving to HTTPS. Seems simple enough, but this means the entire site URL is changing. Even with post slugs remaining the same, the root will be different. All inter-site links are updated, but external sites may have an issue. I’m working on a redirect now but might take some time.
  • The Search widget has been removed. Let’s be honest: almost nobody used the search feature anyway. Search engines (Google, Bing, Duck-Duck-Go) are how we all search these days. They even have features to search within a specific site. Not worth adding to the competition at this point.
  • Post content has streamlined styling. Markdown files keep it simple when writing a new post. Custom CSS is possible, but not needed. This is a good thing as sometimes too much time is spent finding a custom style to highlight a point. Way too much time.
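For the HTTPS redirect mentioned in the first bullet, most static hosts support a declarative rule. As a hypothetical sketch - the domain and the Netlify-style `_redirects` syntax here are assumptions, not necessarily what this site uses:

```text
# _redirects: send all http traffic to https, preserving the path
http://example.com/*  https://example.com/:splat  301!
```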
What’s Next

Hopefully everything goes as planned, but I’ll let the dust settle before making more changes. Maybe even find some time to publish some posts that have been sitting in the attic.


A major change to ArcGIS Online User Levels is coming December 4th, 2018. This is a welcome change to the current “Level 1” and “Level 2” setup, but it will come with some growing pains. These User Levels will be converted to the newer “User Types”, and each User Type will have a mix of capabilities and apps available.

What’s changing?

With the December release of ArcGIS Online, all existing Level 1 and Level 2 users will be changed to new User Types, along with 3 additional User Types being added. The new names are: ArcView and ArcIn… just kidding! For existing accounts, this is more of a re-branding, since nobody wanted to be Level 3 with a sibling that was Level 4 (that is dating a Level 1 that used to be a Level 5). Drama.

The bigger change here is the introduction of a new “GIS Professional” type. It’s this new type that has access to ArcGIS Pro, and isn’t directly related to the current Level 2 account. Maybe a chart and picture will help:

Compare old and new User Types

New User Types (via Esri)

Comparing User Types

User Types will be the new way to set up users in your organization based on the Capabilities and Apps required to do their work. Some of these new User Types will also have additional add-ons that can be configured to enhance the standard set of apps available.

The primary capabilities for each User Type are an expansion of the Level 1 and Level 2 users of today. I’ve taken the list from Esri and removed the App bundles to focus on the primary capabilities, but I strongly encourage reviewing their documentation for full details.

AGOL Capabilities

The Apps provided for each level are grouped together in named bundles. This is because we didn’t have enough new terminology at this point (and it’s much easier to refer to a long list of apps if Esri bundles them). Either way, here is the breakdown with a selection of pastel colours I was able to find in Excel:

Apps by bundle and user type

Wondering where some of the niche Apps went? Not to fear - Navigator for ArcGIS, ArcGIS Business Analyst, Insights, Drone2Map, and Maps for Adobe Creative are all still available to add to your subscription.

Mark your calendars!


This post will discuss how to use ArcPy to connect to SDE, create a new version, and switch to that version, all inside a Python script. At first glance this should be pretty straightforward - provide some connection details and a version name. Well, ArcPy isn’t really designed for that approach. There currently isn’t an ArcPy function to connect to SDE within code, or to deal with new/changing versions. The current documented approach is to use toolbox tools or create the version in advance, create a connection file pointing to that version, and use this as a reference in a script.

There are many reasons why the documented approach might not meet our needs: we want everything inside one script without any related files required; we don’t know the new version name in advance; we want to run on Linux (ArcGIS Server); or we simply want the option to use a single script for our scripty stuff.

Creating an SDE connection file in ArcPy

At this point, it might be easier to open ArcCatalog, create the SDE connection file, and point your script to it. But if you want the “everything we need is in this script” approach, how far can we go? Well, not too far unless we are willing to use some Toolbox tools along the way.

Creating the SDE connection file isn’t a problem since there is a toolbox tool to create one. Inside our Python script we can call the CreateDatabaseConnection_management tool directly and save the connection file for later use.

CreateDatabaseConnection_management(out_folder_path, out_name, 
    database_platform, instance, {account_authentication},
    {username}, {password}, {save_user_pass}, {database},
    {schema}, {version_type}, {version}, {date})

We are going to use Windows Authentication so we don’t need to expose any user credentials for now. When creating the connection, we need to supply a few parameters that impact our script:

  1. ‘version’: The name we specify here must already exist. In this example, we will use the Default version.
  2. An output folder path (directory) for the file that is created. If on a server, there is a chance we aren’t sure where to save, or the script directory is read-only, and saving to c:\temp makes some admins a little nervous. Hmmmm, where to put the file?

What if we used the ArcPy “inMemory” space for the output? This would meet our needs, but inMemory doesn’t currently support this technique. Thankfully there is a great module called “tempfile” that comes included with core Python to handle this much the same as inMemory. The tempfile module generates temporary files and directories. It also works on all Python supported platforms.

tempfile.mkdtemp([suffix=''[, prefix='tmp'[, dir=None]]])

Within the tempfile module, there are commands for both directories and files that we can use much like the inMemory feature. The directory will be secure and accessible by the account running the script - just make sure to clean up the temp directory when you are done.

sdeTempPath = tempfile.mkdtemp()
arcpy.CreateDatabaseConnection_management(sdeTempPath,
  'ConnName.sde','SQL_SERVER','db\\instance',
  'OPERATING_SYSTEM_AUTH','#', '#', '#','Database')

Creates a temporary directory in the most secure manner possible. There are no race conditions in the directory’s creation. The directory is readable, writable, and searchable only by the creating user ID. The user of mkdtemp() is responsible for deleting the temporary directory and its contents when done with it. ~Python Software Foundation, Overview of Tempfile

Create a new version in ArcPy

Creating a version again requires a toolbox tool - CreateVersion_management.

CreateVersion_management (in_workspace, parent_version,
  version_name, {access_permission})

When creating a version in a DBMS like SQL Server, the geodatabase will prefix the version name with the name of the user - even though we didn’t supply ours directly, it was determined when we created the connection file using the current Windows account. So before we can use the version, we will need to find it. Side note: we are assuming the version doesn’t already exist - you might want to check before creating.

arcpy.CreateVersion_management(sdeTempPath + os.sep +
  'ConnName.sde', 'sde.DEFAULT', 
  sdeVersionName, 'PUBLIC')
Using the new version

To continue complicating things, we can’t just switch to our newly created version in the connection we already created. The only way to reference a version in ArcPy is to use a connection file that already points to that version. A little redundant, but the good news is that we just learned how to create an SDE connection file, so we just need to find the full version name (user.version) and create a connection one more time with the new version name included.

The Code

Here is a script that will use ArcPy to Connect to SDE, create a new version, and switch to that version all inside a Python (ArcPy) script. This is just a sample and it is recommended to add more error handling, trap for correct licenses, check if the version exists, etc.

#Name: Python ArcPy Connect to SDE and versions
#Author: Bryan McIntosh
#Description: An approach to connect to SDE, create a new version, and change
# to the new version - inside a python script.
import arcpy, datetime, os, tempfile, shutil
#Create a string of the date to append to version name (to keep unique)
now = datetime.datetime.now()
strDate = str(now.year) + str(now.month) + str(now.day) + str(now.hour) + str(now.minute) + str(now.second)
#Create a temp directory using tempfile module to store SDE connection files
sdeTempPath = tempfile.mkdtemp()
#Setup first SDE Connection
arcpy.CreateDatabaseConnection_management(sdeTempPath,'ConnName.sde','SQL_SERVER','db\\instance','OPERATING_SYSTEM_AUTH','#', '#', '#','DBName')
##Create a new version (default as parent version)
sdeVersionName = 'MyVersion_' + strDate
arcpy.CreateVersion_management(sdeTempPath + os.sep + 'ConnName.sde', 'prefixName.DEFAULT', sdeVersionName, 'PUBLIC')
#The prefix of the version name isn't known (based on user), so need to find it so we can connect later
sdeVersionNameFULL = ''
for version in arcpy.da.ListVersions(sdeTempPath + os.sep + 'ConnName.sde'):
    if version.name.split('.')[1] == sdeVersionName and not version.children:
        sdeVersionNameFULL = version.name.split('.')[0] + '.' + sdeVersionName
        arcpy.AddMessage('Version verified: ' + sdeVersionNameFULL)
##Create new connection file pointing to the new version
arcpy.CreateDatabaseConnection_management(sdeTempPath,'ConnVersion.sde','SQL_SERVER','db\\instance','OPERATING_SYSTEM_AUTH','#', '#', '#','DBName','#','#', sdeVersionNameFULL)
##DO STUFF WITH THE NEW VERSION
##DO STUFF WITH THE NEW VERSION
##When done, Remove the temp path containing connection files
shutil.rmtree(sdeTempPath)
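A small aside on the date suffix at the top of the script: the str() concatenation works, but it drops leading zeros (9:05 becomes “95”), so version names won’t sort chronologically. strftime gives a fixed-width, zero-padded suffix instead:

```python
import datetime

# Zero-padded, fixed-width timestamp suffix (YYYYMMDDHHMMSS),
# so generated version names sort chronologically
now = datetime.datetime.now()
strDate = now.strftime('%Y%m%d%H%M%S')
sdeVersionName = 'MyVersion_' + strDate
```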

Collector for ArcGIS v18.0.1 is now released for all supported platforms. For iOS users, this includes the use of Leica’s Zeno GG04 plus. All platforms get an enhancement for offline tile usage security, and many bug fixes.

The bug fixes for iOS include many of the minor, yet annoying, issues that create some confusion with field staff:

  • BUG-000091922: After searching for a feature in Collector for ArcGIS (iOS) on an iPad Air, there is no option to Copy, Edit, Delete, Zoom to, or Get Directions to the feature
  • BUG-000092762: Orphaned replicas are seen under the REST endpoint of a hosted feature service after removing a map from a mobile device via the Collector for ArcGIS (iOS 8.x) application without synchronizing the edits
  • BUG-000093776: Collector for ArcGIS (iOS) shows duplicate results if multiple searches and selections are executed for the same feature using the Find Locations By Layer application setting
  • BUG-000095584: Unable to add attributes to a related table after searching the feature with a hosted feature layer in Collector for ArcGIS (iOS)
  • BUG-000109015: After searching for a feature in Collector for ArcGIS (iOS), updating a field that controls a feature’s symbology from these search results causes the symbology of the feature not to update, even though the field is updated.
  • BUG-000109894: Collector for ArcGIS (iOS) location tracking stops when the basemap is changed.

Make sure to update all your devices, and have fun in the field!


It was finally time to get bundled up and test the SXBlue Platinum in nature. Not only to test the functionality, but with the recent -20°C outside, to test the device in a traditional Canadian winter. Before heading out, it was time to set up the device with ArcGIS Collector (v17.0.4). Just in case, the device comes with a great quick start guide to explain the status lights and how to pair with iOS and Android. It goes even further and helps you set up the SXBlue Platinum with ArcGIS Collector, which was a great reminder of the “Provider” adjustments that may be needed.

SXBlue Platinum and ArcGIS Collector Setup

There are many ways to use the SXBlue Platinum with ArcGIS Collector. The basic setup is to configure the Provider to SXBlue and leave the Location Profile defaults (AKA, do nothing). This is the one to use when the device is reading SVs from GPS/GLONASS, etc. without any additional augmentation/corrections. Here in Ontario, there is a very good chance the device will pick up SBAS (WAAS). With this bonus augmentation, there is a setting change to the Location Profile to get the optimal config in Collector (you need to tell Collector about the signal). Don’t worry, the quick start guide covers that too!

By checking the SXBlue status indicators or the status screen in Collector, you can determine if you are currently connected to SBAS.

SXBlue Platinum - SBAS for all!

Update Note Feb 08, 2018: The items in the image to help verify SBAS are the “Fix Type”, “Station ID”, and “Correction Age”. The Location Profile name “SBAS” is my own naming of a profile and doesn’t reflect the current WAAS status.

If you are using additional correction services for increased accuracy such as Atlas, CAN-Net, Top-Net, etc - you will also need to change the Location Profile to match the signal from the correction service.

Vertical Elevation

ArcGIS Collector doesn’t currently support adding a GEOID for Orthometric height. The SXBlue Platinum, like every GNSS device, will output Ellipsoidal elevations directly to the application it’s paired with. The device doesn’t store GEOIDs inside (not unlimited space), but does work with Apps/Software that do support GEOID files. So no matter what you are collecting in ArcGIS Collector, all vertical data is being stored in Ellipsoidal height. If you really want to store Orthometric elevation values, the SXBlue Platinum does work with other iOS/Android apps that have this ability. In short, this is a limitation of ArcGIS Collector, not the SXBlue Platinum. Maybe a future release from Esri will incorporate this feature - although you can always post process.
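If you do post-process, the conversion itself is simple once you have a geoid undulation (N) for the location: orthometric height H equals ellipsoidal height h minus N. A minimal sketch - the undulation value below is illustrative only, not a real geoid lookup:

```python
def orthometric_height(ellipsoidal_h, geoid_undulation):
    """H = h - N: orthometric height from ellipsoidal height.

    ellipsoidal_h: height above the ellipsoid in metres (GNSS output)
    geoid_undulation: geoid height N in metres, from a geoid model
    such as CGG2013a (the value used below is illustrative only)
    """
    return ellipsoidal_h - geoid_undulation

# Southern Ontario has a geoid undulation of roughly -36 m, so an
# ellipsoidal reading of 40 m works out to roughly 76 m orthometric
h_ortho = orthometric_height(40.0, -36.1)  # ≈ 76.1 m
```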

In the Field

My preference is to mount the antenna to a pole or a small extender on a backpack, depending on accuracy needs. These approaches give the antenna 360 degrees of SV potential. In Collector you can set a mounting height in the Provider setup screen. With the DGPS light glowing and the Collector status showing that I have WAAS, I can see that the blue circle on the map is nice and tiny - a good sign of high accuracy beyond the status window. At this point, the accuracy was… 49cm (1.6 ft). Reviewing against some controls, accuracy was definitely up there. So overall, the SXBlue Platinum is sending data at the level of accuracy we would hope for.

As mentioned in the SXBlue Platinum Introduction, the RTK package comes with 1Hz by default. This is great for any foot-powered mission. And if you want to upgrade to 10Hz or 20Hz and mount it to a race car, those options are still unlockable at any time. As I walked through the trails and on the sidewalk I would keep my eye on the status window - nothing much to report as I had strong accuracy readings all afternoon. Every once in a while I would switch to the SXBlue app for a more detailed look at what the device was doing.

iSXBlue RTN App for iOS

Lots of SVs, strong signals: so boring - and boring is great. I tried picking up my pace and running (relatively speaking) through a few sections and the SXBlue was doing a great job updating the paired Collector app.

Overall, the SXBlue Platinum did a great job - and this is without some of the more advanced features like additional correction services. In reflection, it was just easy, worked as expected, and exceeded my accuracy expectations (full report on that another time). All good stuff when getting out into the elements!
