A Gentle Introduction To Actinia
Author: Markus Neteler, mundialis GmbH & Co. KG, Bonn
Note: A fork of this workshop focusing on ace (the "actinia command execution") is available at
https://mmacata.github.io/actinia-introduction/; it comes with a shifted focus on the "bare"
actinia HTTP API and extended exercises.
Abstract
Actinia (https://actinia.mundialis.de/) is an open source REST API for
scalable, distributed, high performance processing of geographical
data that mainly uses GRASS GIS for computational tasks. Core
functionality includes the processing of single scenes and time series
of satellite images, as well as raster and vector data. With the existing (e.g.
Landsat) and Copernicus Sentinel big geodata pools, which are
growing day by day, actinia is designed to follow the paradigm of
bringing algorithms to the cloud-stored geodata. Actinia has been an OSGeo Community Project
since 2019.
In this course we give a short introduction to REST APIs and cloud processing concepts, followed
by an introduction to actinia processing, along with hands-on exercises to get more familiar with
the topic.
GRASS GIS (download)
three additional Python packages:
Linux: pip3 install click requests simplejson
Windows users (Installer: OSGeo4W > Advanced installation > Search window):
python3-click, python3-requests, python3-simplejson
ace - actinia command execution (to be run from a GRASS GIS session; installation shown
below)
jq, a lightweight and flexible command-line JSON processor
nice to have: QGIS
Note: We will use the demo actinia server at https://actinia.mundialis.de/ - hence Internet
connection is required.
Warming up
Introduction
Why cloud computing?
Overview actinia
To make you familiar with a few concepts, let's take a look at the "graphical intro to actinia" -
GRASS GIS in the cloud: actinia geoprocessing (note: it requires the Chrome/ium browser;
presentation provided by Carmen Tawalika, mundialis).
Introduction
For this tutorial we assume working knowledge concerning geospatial analysis and Earth
observation (i.e., geodata such as raster, vector, time series, and image data including aerial, drone,
and satellite data). The tutorial includes, however, a brief introduction to REST (Representational
State Transfer) API basics and cloud processing related hints.
With the tremendous increase of available geospatial and Earth observation data, lately driven by
the European Copernicus programme (esp. the Sentinel satellites) and the increasing availability of
open data, the need for computational resources grows in a non-linear way.
(Ideally) enjoy the five V’s of big data: Volume, velocity, variety, veracity and value.
Overview actinia
Actinia (https://actinia.mundialis.de/) is an open source REST API for scalable, distributed, high
performance processing of geospatial and Earth observation data that mainly uses GRASS GIS for
computational tasks. Core functionality includes the processing of single scenes and time series of
satellite images, as well as raster and vector data. With the existing (e.g. Landsat) and Copernicus
Sentinel big geodata pools, which are growing day by day, actinia is designed to follow the
paradigm of bringing algorithms to the cloud-stored geodata. Actinia has been an OSGeo
Community Project since 2019. The source code is available on GitHub at
https://github.com/mundialis/actinia_core. It is written in Python and uses Flask, Redis, and other
components.
Components of actinia
Core system:
actinia-core: an open source REST API for scalable, distributed, and high performance
processing of geographical data that mainly uses GRASS GIS for computational tasks. It can be
installed as is or enhanced with multiple plugins.
API:
Plugins:
Related:
openeo-grassgis-driver: OpenEO driver for GRASS GIS/actinia. Backend description at
https://openeo.mundialis.de/.well-known/openeo
Fig. 1: Components of actinia (core and plugins)
Actinia is not only a REST interface to GRASS GIS: it also offers the possibility to extend its
functionality with other software (ESA SNAP, GDAL, ...). To integrate software other than GRASS
GIS, a wrapper script has to be written (in the style of a GRASS GIS addon Python script) which
includes the respective function calls of the software to be integrated. Calling shell commands in an
actinia process chain is also possible but limited due to security risks.
Persistent storage refers to a data store that retains data even in the event of a power-off, as well
as retaining it without a scheduled deletion time. In the geo/EO context, persistent storage is used
to provide, for example, base cartography, i.e. elevation models, road networks, building footprints,
etc.
The ephemeral storage is used for on-demand computed results, including user generated data and
temporary data occurring in processing chains. In ephemeral storage, data are only kept for a
limited period of time (in actinia, 24 hours by default).
In the cloud computing context these differences are relevant, as costs are incurred when storing data.
Accordingly, actinia offers two modes of operation: persistent and ephemeral processing. In
particular, the actinia server is typically deployed on a server with access to a persistent GRASS GIS
database (PDB) and optionally to one or more GRASS GIS user databases (UDB).
Actinia is deployed multiple times as so-called actinia nodes (physically distinct machines)
where the actual computations are performed. They can be deployed with the help of cloud
technologies such as Kubernetes, OpenShift, or Docker Swarm. This technology then acts as a load
balancer, distributing jobs to actinia nodes. Results are either stored in GRASS UDBs in GRASS
native format or directly exported to a different data format (see Fig. 2).
Fig. 2: Persistent and ephemeral storage with actinia nodes (source: mundialis FOSS4G talk 2019)
Architecture of actinia
Several components play a role in a cloud deployment of actinia (for an example, see Fig. 3):
analytics: the workers running GRASS GIS or other wrapped software (GDAL, ESA SNAP, ...),
external data sources: import providers for various external data sources,
interface layer:
most importantly, the REST API,
openEO GRASS GIS driver,
ace - actinia command execution (to be run in a GRASS GIS session),
In short, deployment means starting software, usually in an automated way, on one or more
computer nodes. There are a number of technologies for this. In particular, virtualisation plays an
important role here, as it avoids a high dependency on hardware and software characteristics
through abstraction.
An aim is to operate Infrastructure as Code (IaC), i.e. to have a set of scripts which order the
needed computational resources in the cloud, set up the network and storage topology, connect to
the nodes, and provision them with the needed software (usually Docker based, i.e. so-called
containers are launched from prepared images) and processing chains. Basically, the entire software
part of a cloud computing infrastructure is launched "simply" through scripts, with the advantage of
easily restarting it as needed, maintaining it, and migrating it to other hardware.
With respect to actinia, various ways of deployment are offered: local installation, Docker, docker-
compose, Docker Swarm, OpenShift, and Kubernetes.
In detail, a REST API uses URL arguments to specify what information shall be returned through the
API. This is not much different from requesting a Web page in a browser, but through the REST API
we can execute commands remotely and retrieve the results.
Each URL call is a request, while the data sent back to the user after some processing is the
response.
There are two types of request: synchronous and asynchronous. In the case of a synchronous
request, the client sends it to the server and waits for a response. In geospatial computing,
processing can take some time, which would block the client because it is only waiting. In default
configurations the communication is canceled by the client after some minutes, called a "timeout".
To avoid this, there is also the asynchronous request type. Here the client does not wait for a
response directly, but either checks from time to time whether the job has been completed
("polling"), or itself provides an API which is notified when the job is finished (a "webhook").
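The polling pattern can be sketched as a small shell loop. Here fake_status is a stand-in for the real status request (in practice an authenticated curl call on the resource URI, piped through jq to extract the status field); the loop logic is the same either way.

```shell
# poll until the job leaves the queue; fake_status stands in for
# a real request such as:
#   state=$(curl ${AUTH} -s "$STATUS_URL" | jq -r .status)
n=0
fake_status() {
  n=$((n+1))
  if [ "$n" -ge 3 ]; then state="finished"; else state="running"; fi
}

while :; do
  fake_status
  echo "poll: $state"
  case "$state" in
    finished|error|terminated) break ;;   # terminal job states
  esac
  sleep 0.2   # in real use: wait a few seconds between polls
done
echo "job ended with status: $state"
```

The webhook variant inverts this: instead of the client asking repeatedly, the server calls back a URL you registered once the job reaches a terminal state.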
A REST request consists of four parts:
the endpoint
the header
the data (or body)
the methods
Endpoint
In general, an endpoint is an entry point to a service, a process, or a queue or topic destination in
service-oriented architectures. In the case of actinia, it may be a data query function, the
computation of a vegetation index, the deletion of a dataset, and more. Effectively, an endpoint is
the URL you request. It follows this structure: https://api.some.server/endpoint. The final part of
an endpoint can be query parameters. Using query parameters you can modify your request with
key-value pairs, introduced by a question mark ( ? ); parameter pairs are separated by an
ampersand ( & ), e.g.:
?query1=value1&query2=value2
As an example, we check the repositories of a GitHub user, in sorted form, using the repos
endpoint + query:
https://api.github.com/users/neteler/repos?sort=pushed
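Putting the pieces together, a request URL is just the endpoint with the key-value pairs appended (per_page here is simply a second illustrative GitHub API parameter, not part of the original example):

```shell
endpoint="https://api.github.com/users/neteler/repos"

# first parameter is attached with '?', further ones with '&'
url="${endpoint}?sort=pushed&per_page=5"
echo "$url"
```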
Both requests and responses have two parts: a header, and optionally a body
Header information contain e.g. authentication
In both requests and responses, the body contains the actual data being transmitted
The request body is only necessary for certain HTTP methods (e.g. HTTP POST) and can contain
any form of data, e.g. an actinia process chain
The response body returns information or results. Examples in actinia are json data or GeoTIFF
results
Methods
In REST APIs, every request has an HTTP method type associated with it.
The most common HTTP methods (or verbs) include:
GET - a GET request asks to receive a copy of a resource
POST - a POST request sends data to a server in order to create new resources
PUT - a PUT request sends data to a server in order to replace or update existing resources
Response codes
HTTP responses don't have methods, but they do have status codes: HTTP status codes are
included in the header of every response in a REST API. Status codes include information about
the result of the original request.
Selected status codes (see also https://httpstatuses.com):
200 - OK | All fine
401 - Unauthorized | The request was rejected, as the sender is not (or wrongly) authorized
404 - Not Found | The requested resource was not found
500 - Internal Server Error | Ouch, something went wrong while the server was processing
your request
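In a script, the status code is what you branch on. curl can report it via its -w '%{http_code}' option; in this sketch the code is hard-coded so the example runs without a network connection:

```shell
# in real use:
#   code=$(curl ${AUTH} -s -o /dev/null -w '%{http_code}' "$URL")
code=401

case "$code" in
  200) msg="OK" ;;
  401) msg="check your credentials" ;;
  404) msg="endpoint or resource not found" ;;
  5??) msg="server-side error, retry later" ;;
  *)   msg="unexpected status: $code" ;;
esac
echo "$msg"
```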
JSON is a structured, machine-readable format (and human readable too, in contrast to XML, at
least for many people). JSON is short for JavaScript Object Notation. For example, a call to the
GRASS GIS command v.buffer looks like this as an actinia process chain step:
{
"module": "v.buffer",
"id": "v.buffer_1804289383",
"inputs":[
{"param": "input", "value": "roadlines"},
{"param": "layer", "value": "-1"},
{"param": "type", "value": "point,line,area"},
{"param": "distance", "value": "10"},
{"param": "angle", "value": "0"},
{"param": "scale", "value": "1.0"}
],
"outputs":[
{"param": "output", "value": "roadbuf10"}
]
}
Hint: When writing JSON files, some linting (the process of checking the source code for programmatic
as well as stylistic errors) might come in handy, e.g. by using https://jsonlint.com/.
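Any local JSON parser also works as a quick linter. A sketch using Python's built-in json.tool on a minimal process-chain step (jq empty step.json would work the same way, if jq is installed):

```shell
# write a minimal process-chain step to a file
cat > step.json <<'EOF'
{"module": "v.buffer",
 "inputs": [{"param": "input", "value": "roadlines"}],
 "outputs": [{"param": "output", "value": "roadbuf10"}]}
EOF

# json.tool exits non-zero on malformed JSON, so it doubles as a linter
if python3 -m json.tool step.json > /dev/null; then
  result="valid JSON"
else
  result="invalid JSON"
fi
echo "$result"
```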
Step by step...
We will now send a REST API call to the actinia server and receive the server response.
Step 1:
Get your credentials (for authentication) from the trainer or use the demouser with
gu3st!pa55w0rd
Step 2:
Step 3:
Choose and launch your REST client: cURL or RESTman or ...
Choose and launch your REST client: cURL or RESTman or ...
ab
a) cURL, on the command line
b) Thunder Client plugin, for the VS Code editor, or find a plugin for your favorite editor
c) RESTman (manual), in the browser
For a curl example, see below ("REST actinia examples with curl").
Step 4:
So far we have seen some REST basics and explored a few endpoints provided by actinia. Indeed,
the structure of the endpoints follows some GRASS GIS concepts (compare the graphical
introduction above), but this does not limit us much from processing "any" geospatial data.
Exploring the API: finding available actinia endpoints
The actinia REST API documentation is available online (directly generated from the source code of
actinia). Check out some of the various sections in the actinia API docs:
Module Viewer
Process Chain Template Management
Authentication Management
API Log
Cache Management
File Management
Satellite Image Algorithms
Location Management
Mapset Management
Processing
Raster Management
Raster Statistics
STRDS Management (STRDS = space-time raster data set)
STRDS Sampling
STRDS Statistics
Vector Management
Mapsets
GeoNetwork
Resource Management
Process Chain Monitoring
Resource Iteration Management
User Management
To see a simple list of endpoints (and more), have a look at the "paths" section in the API JSON.
If the formatting looks "ugly", get the JSON Formatter extension.
Fig. 5: actinia list of endpoints (in the "paths" section)
List of supported processes (> 500): see API modules (note: the process chain templates are at
bottom, category "actinia-module")
# note: no AUTH needed
curl --no-progress-meter -X GET https://actinia.mundialis.de/api/v3/swagger.json | jq '.paths | keys'
[
"/actinia_modules",
"/actinia_modules/{actiniamodule}",
"/actinia_templates",
"/actinia_templates/{template_id}",
"/api_key",
"/api_log/{user_id}",
"/download_cache",
"/files",
"/grass_modules",
"/grass_modules/{grassmodule}",
"/landsat_process/{landsat_id}/{atcor_method}/{processing_method}",
"/landsat_query",
"/locations",
"/locations/{location_name}",
"/locations/{location_name}/info",
"/locations/{location_name}/mapsets",
"/locations/{location_name}/mapsets/{mapset_name}",
"/locations/{location_name}/mapsets/{mapset_name}/info",
...
"/sentinel2_query",
"/sentinel2a_aws_query",
"/token",
"/users",
"/users/{user_id}"
]
Here we use the command line and the curl tool to communicate with the actinia server.
Optionally, to beautify the output, we use the jq command-line JSON processor, which helps to
turn the output into something human readable (download jq).
Hint: If you have trouble using jq on the command line, you can also use it in a browser at
https://jqplay.org/: copy the JSON code into the "JSON" field and a . into the "Filter" field, and it
will show the result.
Preparation
To simplify our life in terms of server communication, we store the credentials and REST server URL
in environment variables (this is only relevant for command line usage; in RESTman the browser
will request the credentials):
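A minimal sketch of this preparation; the variable names are chosen to match the ${AUTH} and ${actinia} placeholders used in the curl calls below, and the demo credentials should be replaced with your own:

```shell
# REST server URL and credentials for the demo server
export actinia="https://actinia.mundialis.de"
export AUTH='-u demouser:gu3st!pa55w0rd'

# every later call then reads e.g.:
#   curl ${AUTH} -X GET "${actinia}/api/v3/locations"
echo "server: $actinia"
```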
First, we want to see the list of available "locations". A location in GRASS-speak is simply a project
folder which contains geospatial data:
Next, we look at so-called "mapsets" which are subfolders in a location (just to better organise the
geospatial data):
# show available mapsets of a specific location
curl ${AUTH} -X GET "${actinia}/api/v3/locations/nc_spm_08/mapsets"
Eventually, digging more for content in "location" and "mapsets", we can look at the datasets stored
therein:
Vector data:
Raster data:
# show available raster maps in a specific location/mapset
curl ${AUTH} -X GET "${actinia}/api/v3/locations/nc_spm_08/mapsets/PERMANENT/raster_layers"
curl ${AUTH} -X GET "${actinia}/api/v3/locations/nc_spm_08/mapsets/landsat/raster_layers"
curl ${AUTH} -X GET "${actinia}/api/v3/locations/nc_spm_08/mapsets/modis_lst/raster_layers"

# show metadata of a specific raster map
curl ${AUTH} -X GET "${actinia}/api/v3/locations/nc_spm_08/mapsets/landsat/raster_layers/lsat7_
Space-time raster datasets (STRDS):
It's time to retrieve something from the server. As a start, we will query the landuse and elevation
at a certain position (coordinates):
cat test_raster_sample_data.json
{
"points": [
[
"p1",
"638684.0",
"220210.0"
],
[
"p2",
"635676.0",
"226371.0"
]
]
}
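If the file does not exist yet, it can be created directly on the command line (same filename and content as shown above):

```shell
# create the sampling payload used in the queries below
cat > test_raster_sample_data.json <<'EOF'
{"points": [["p1", "638684.0", "220210.0"],
            ["p2", "635676.0", "226371.0"]]}
EOF

# quick sanity check that the payload parses as JSON
python3 -m json.tool test_raster_sample_data.json > /dev/null && echo "payload ready"
```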
Landuse query
# we limit the output to "process_results" with jq and convert to CSV format
curl ${AUTH} --no-progress-meter -X POST -H "content-type: application/json" "${actinia}/api/v3
0,1,2,3,4,5
"easting","northing","site_name","elevation","elevation_label","elevation_color"
"638684","220210","","113.3384","","255:145:000"
"635676","226371","","147.6621","","094:067:039"
Next we want to query the stack of multitemporal datasets available and more specifically, retrieve
MODIS Land Surface Temperature (LST) values from the space-time cube at a specific position
(North Carolina data set; at 78W, 36N). For this, we use the endpoint sampling_sync_geojson:
# query point value in a STRDS, sending a GeoJSON file of the point position along with the req
# (North Carolina LST time series)
curl ${AUTH} -X POST -H "content-type: application/json" "${actinia}/api/v3/locations/nc_spm_08
change the type of request from GET to POST (at the top of the page, left of the URL)
set the header content-type with application/json as value
add the JSON in the body section, in RAW format
In the example above we have sent JSON code to the server directly in the request. However, with
longer process chains this is hard to manage. It is often much more convenient to store the JSON
code as "payload" in a file and send it to the server:
# note: you can easily generate such a GeoJSON file with ogr2ogr or v.out.ogr
#
# store the query in a JSON file "pc_query_point_.json" (or use a text editor for this)
echo '{"type":"FeatureCollection","crs":{"type":"name","properties":{"name":"urn:ogc:def:crs:EP

# send JSON file as payload to query the STRDS
curl ${AUTH} -X POST -H "content-type: application/json" "${actinia}/api/v3/locations/nc_spm_08
Validation of a process chain
Why validation? It may happen that your JSON file to be sent to the endpoint contains a typo or
other invalid content. To identify problems before executing the commands contained in the JSON
file (which may run for hours), it is recommended to validate the file first. Actinia can be used for
this, as it provides a validation endpoint.
To turn a process chain back into command style notation, the validator can be used and the
relevant code extracted from the resulting JSON response. Download the process chain
process_chain_long.json and extract the section containing the underlying commands by parsing
the actinia server response with jq :
# command extraction from a process chain (using sync call) by parsing the 'process_results' re
curl ${AUTH} --no-progress-meter -H "Content-Type: application/json" -X POST "${actinia}/api/v3
[
"grass g.region ['raster=elevation@PERMANENT', 'res=10', '-p']",
"grass r.slope.aspect ['elevation=elevation@PERMANENT', 'format=degrees', 'precision=FCELL',
"grass r.watershed ['elevation=elevation@PERMANENT', 'convergence=5', 'memory=500', 'accumula
"grass r.info ['map=my_aspect', '-gr']"
]
Actinia can import from external Web resources, use data on the actinia server (persistent and
ephemeral storage), and make results available for download as Web resources. The latter can
then be downloaded, opened in QGIS, or imported into GRASS GIS or other software. Note that the
download of Web resources provided by actinia requires authentication, e.g. as the demouser .
Dealing with workflows (processing chains)
The overall goal is to "get stuff done". In this case it means that we can concatenate (chain) a series
of commands where the output of one step may be used as the input of the following step.
To turn this concept into an example, we use again the process chain process_chain_long.json from
above and execute it, here using the asynchronous processing_async_export endpoint. This
activates the exporter in the process chain, which delivers the computed maps as Web resources
for subsequent download:
Being an asynchronous process, the result is not offered directly; instead, at the bottom of the JSON
output (in the terminal) a resource ID (red box) and a resource URI are shown:
Use this URI for retrieving the process status, e.g. using your browser ( F5 to reload the page until
the job is ready). Once the job has been completed ("Processing successfully finished"), three Web
resources (here: COG - Cloud Optimized GeoTIFF) are shown at the bottom of the JSON output:
# update the URI to that of your job, and be sure to use https:
curl ${AUTH} -X GET "https://actinia.mundialis.de/api/v3/resources/demouser/resource_id-284d42c
...
"status": "finished",
"time_delta": 3.7403182983398438,
"timestamp": 1580767679.525925,
"urls": {
"resources": [
"http://actinia.mundialis.de/api/v3/resources/demouser/resource_id-284d42c7-9ba7-415d-b67
"http://actinia.mundialis.de/api/v3/resources/demouser/resource_id-284d42c7-9ba7-415d-b67
"http://actinia.mundialis.de/api/v3/resources/demouser/resource_id-284d42c7-9ba7-415d-b67
],
"status": "http://actinia.mundialis.de/api/v3/resources/demouser/resource_id-284d42c7-9ba7-
},
"user_id": "demouser"
}
The resulting files can now be downloaded (they'll remain on the server for 24 hours).
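Collecting the download URLs from a saved response can be scripted. The response below is a trimmed stand-in with placeholder URLs, mirroring only the structure shown above:

```shell
# trimmed stand-in for a finished-job response (placeholder URLs)
cat > response.json <<'EOF'
{"status": "finished",
 "urls": {"resources": ["https://actinia.example.org/result/slope.tif",
                        "https://actinia.example.org/result/aspect.tif"]}}
EOF

# print one resource URL per line (then fetch each with an authenticated curl)
urls=$(python3 -c 'import json
r = json.load(open("response.json"))
assert r["status"] == "finished"
print("\n".join(r["urls"]["resources"]))')
echo "$urls"
```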
Controlling actinia from a running GRASS GIS session is a convenient way of writing process chains.
It requires some basic GRASS GIS knowledge (for an intro course, e.g. see here).
The ace (actinia command execution) tool allows the execution of a single GRASS GIS command
or a list of GRASS GIS commands on an actinia REST service (e.g., https://actinia.mundialis.de/). In
addition it provides job management, the ability to list the locations, mapsets, and map layers the
user has access to, as well as the creation and deletion of mapsets. The ace tool is a GRASS GIS
addon and must be executed in an active GRASS GIS session.
By default, all commands are executed in an ephemeral database on the actinia server. Hence,
generated output must be exported using augmented GRASS commands in order to use it further.
Note: The option mapset=MAPSET_NAME allows the execution of commands in the persistent user
database. It can be used together with the option location=LOCATION_NAME.
Preparation
To use ace, some things are required:
In case not yet present on the system, the following Python libraries are needed:
You need to run the installer steps in a running GRASS GIS session:
To try out ace , start GRASS GIS with the nc_spm_08 North Carolina sample location. You can
download it easily through the Download button in the graphical startup (recommended; see Fig. 9)
or from grass.osgeo.org/download/sample-data/.
Fig. 9: Download and extraction of nc_spm_08 North Carolina sample location ("Complete NC
location")
Before starting GRASS GIS with the downloaded location create a new mapset "ace" in nc_spm_08 .
Note: Since we want to do cloud computing, the full location would not be needed, but it is useful
to have for an initial exercise in order to compare local and remote computations.
Authentication settings
The user must set the following environment variables to specify the actinia server and
credentials:
North Carolina sample dataset (NC State-Plane metric CRS, EPSG: 3358):
base cartography ( nc_spm_08/PERMANENT ; source: https://grassbook.org/datasets/datasets-
3rd-edition/)
Landsat subscenes ( nc_spm_08/landsat ; source: https://grass.osgeo.org/download/sample-
data/)
MODIS LST time series ( nc_spm_08/modis_lst ; source: NASA)
List locations, mapsets and maps
In order to list the locations the user has access to, run
ace -l
['latlong_wgs84', 'nc_spm_08', 'ECAD']
The following command lists mapsets of current location in the active GRASS GIS session
("nc_spm_08"):
All following commands can be executed in any active GRASS GIS location, since the location name
at the actinia server is explicitly provided. In case the location option is not provided, the active
location will be used. The following command lists mapsets of the provided location latlong_wgs84:
ace location="latlong_wgs84" -m
['PERMANENT', 'Sentinel2A', 'globcover', 'modis_ndvi_global']
but only if the actinia user is enabled; otherwise the following message appears:
ace location="latlong_wgs84" -m
{'message': "{'Status': 'error', 'Messages': 'Unauthorized access to location "
"<latlong_wgs84>'}"}
To list all raster maps available in the specified mapset belonging to the provided location
nc_spm_08, run:
To list all vector maps available in the specified mapset belonging to the current or a provided
location, run:
ace location="nc_spm_08" mapset="PERMANENT" -v
['P079214',
...
'boundary_county',
'boundary_municp',
'bridges',
'busroute1',
'busroute11',
...
'urbanarea',
'usgsgages',
'zipcodes_wake']
List all raster maps in a location/mapset different from the current GRASS GIS session location:
A great feature is the possibility to import remote datasets on the fly. This means that a raster or
vector file may be retrieved through a URL specified in a command by adding it to the input map
name. There are two options:
1) use of the importer addon (required for multispectral data; works for raster and vector data).
Example:
...
importer raster=ortho2010@https://apps.mundialis.de/workshops/osgeo_ireland2017/north_carolina/
...
2) use of "augmented" map names (GRASS GIS notation extended by actinia) by specifying the
URL with the @ operator to import a web-located resource. Example:
...
g.region raster=elev@https://storage.googleapis.com/graas-geodata/elev_ned_30m.tif -ap
...
Job management
The ace tool can list jobs, choose from all , accepted , running , terminated , finished , error .
Show finished job(s) (note: the actual response may differ):
ace list_jobs="finished"
resource_id-7a94b416-6f19-40c0-96c2-e62ce133ff89 finished 2018-12-17 11:33:58.965602
resource_id-87965ced-7242-43d2-b6da-5ded47b10702 finished 2018-12-18 08:45:29.959495
resource_id-b633740f-e0c5-4549-a663-9d58f9499531 finished 2018-12-18 08:52:36.669777
resource_id-0f9d6382-b8d2-4ff8-b41f-9b16e4d6bfe2 finished 2018-12-17 11:14:00.283710
...
ace list_jobs="running"
resource_id-30fff8d6-5294-4f03-a2f9-fd7c857bf153 running 2018-12-18 21:58:04.107389
ace info_job="resource_id-b1cf32e3-bf07-4b57-858e-5d6a9767dd63"
{'accept_datetime': '2019-09-03 00:50:11.725229',
'accept_timestamp': 1567471811.7252264,
'api_info': {'endpoint': 'rasterlayersresource',
'method': 'GET',
'path': '/api/v3/locations/nc_spm_08/mapsets/new_user_mapset/raster_layers',
'request_url': 'http://actinia.mundialis.de/api/v3/locations/nc_spm_08/mapsets/ne
'datetime': '2019-09-03 00:50:11.813833',
'http_code': 200,
'message': 'Processing successfully finished',
'process_chain_list': [{'1': {'inputs': {'mapset': 'new_user_mapset',
'type': 'raster'},
'module': 'g.list'}}],
'process_log': [{'executable': 'g.list',
'parameter': ['mapset=new_user_mapset', 'type=raster'],
'return_code': 0,
'run_time': 0.0502924919128418,
'stderr': [''],
'stdout': ''}],
'process_results': [],
'progress': {'num_of_steps': 1, 'step': 1},
'resource_id': 'resource_id-b1cf32e3-bf07-4b57-858e-5d6a9767dd63',
'status': 'finished',
'time_delta': 0.08862900733947754,
'timestamp': 1567471811.813823,
'urls': {'resources': [],
'status': 'http://actinia.mundialis.de/api/v3/resources/demouser/resource_id-b1cf32e3
'user_id': 'demouser'}
To generate the actinia process chain JSON request simply add the -d (dry-run) flag:
{
"version": "1",
"list": [
{
"module": "r.slope.aspect",
"id": "r.slope.aspect_1804289383",
"inputs": [
{
"param": "elevation",
"value": "elevation"
},
{
"param": "format",
"value": "degrees"
},
{
"param": "precision",
"value": "FCELL"
},
{
"param": "zscale",
"value": "1.0"
},
{
"param": "min_slope",
"value": "0.0"
}
],
"outputs": [
{
"param": "slope",
"value": "myslope"
}
]
}
]
}
It is very easy and fast to render a map (note: the "demouser" is not enabled for this):
Ephemeral processing is the default processing approach of actinia. Each single command, or all
commands in a shell script, will be executed in an ephemeral mapset. This mapset is removed
after processing. The output of GRASS GIS modules can be marked for export to store the
computational result for download and further analysis.
Running the module g.list in the location defined by the active GRASS GIS session, in an
ephemeral mapset that has only the PERMANENT mapset in its search path:
{'resources': [],
'status': 'https://actinia.mundialis.de/api/v3/resources/demouser/resource_id-db96cd83-dbc2-40
Running the module g.region in a new ephemeral location, to show the default region of a
temporary mapset:
ace location="nc_spm_08" grass_command="g.region -p"
Resource status accepted
Polling: https://actinia.mundialis.de/api/v3/resources/demouser/resource_id-b398b4dd-a47c-4443-
Resource poll status: finished
Processing successfully finished
Resource status finished
--------------------------------------------------------------------------
projection: 99 (Lambert Conformal Conic)
zone: 0
datum: nad83
ellipsoid: a=6378137 es=0.006694380022900787
north: 320000
south: 10000
west: 120000
east: 935000
nsres: 500
ewres: 500
rows: 620
cols: 1630
cells: 1010600
{'resources': [],
'status': 'https://actinia.mundialis.de/api/v3/resources/demouser/resource_id-b398b4dd-a47c-44
Script examples
The following commands (to be stored in a script and executed with ace ) import a raster layer
from an internet source as raster map elev , set the computational region to the map, and
compute the slope. Additional information about the raster layer is requested with r.info .
# grass ~/grassdata/nc_spm_08/user1/
# Import the web resource `elev_ned_30m.tif` and set the region to the imported map
g.region raster=elev@https://storage.googleapis.com/graas-geodata/elev_ned_30m.tif -ap
# Compute univariate statistics
r.univar map=elev
r.info elev
# Compute the slope of the imported map and mark it for export as geotiff file
r.slope.aspect elevation=elev slope=slope_elev+GTiff
r.info slope_elev
Just for inspection, to generate the actinia process chain JSON request add the -d (dry-run) flag:
{
"version": "1",
"list": [
{
"module": "g.region",
"id": "g.region_1804289383",
"flags": "pa",
"inputs": [
{
"import_descr": {
"source": "https://storage.googleapis.com/graas-geodata/elev_ned_30m.tif",
"type": "raster"
},
"param": "raster", "value": "elev"
}
]
},
{
"module": "r.univar",
"id": "r.univar_1804289383",
"inputs": [
{"param": "map", "value": "elev"},
{"param": "percentile", "value": "90"},
{"param": "separator", "value": "pipe"}
]
},
{
"module": "r.info",
"id": "r.info_1804289383",
"inputs": [{"param": "map", "value": "elev"}]
},
{
"module": "r.slope.aspect",
"id": "r.slope.aspect_1804289383",
"inputs": [
{"param": "elevation", "value": "elev"},
{"param": "format", "value": "degrees"},
{"param": "precision", "value": "FCELL"},
{"param": "zscale", "value": "1.0"},
{"param": "min_slope", "value": "0.0"}
],
"outputs": [
{
"export": {"format": "GTiff", "type": "raster"},
"param": "slope", "value": "slope_elev"
}
]
},
{
"module": "r.info",
"id": "r.info_1804289383",
"inputs": [{"param": "map", "value": "slope_elev"}]
}
]
}
To eventually execute the saved script on the actinia server (it will internally convert the script to
JSON and send this as a payload to the server), run:
# Import the web resource and set the region to the imported map
# we apply a trick for the import of multi-band GeoTIFFs:
# install with: g.extension importer url=https://github.com/mundialis/importer/
importer raster=ortho2010@https://apps.mundialis.de/workshops/osgeo_ireland2017/north_carolina/

# The importer has created three new raster maps, one for each band in the geotiff file,
# and stored them in an image group
r.info map=ortho2010.1
r.info map=ortho2010.2
r.info map=ortho2010.3
# Set the region and resolution
g.region raster=ortho2010.1 res=1 -p
# Note: the RGB bands are organized as a group
i.segment group=ortho2010 threshold=0.25 output=ortho2010_segment_25+GTiff goodness=ortho2010_s
# Finally vectorize segments with r.to.vect and export as a GeoJSON file
r.to.vect input=ortho2010_segment_25 type=area output=ortho2010_segment_25+GeoJSON
The results are provided as REST resources for download or consumption in other systems.
GRASS GIS commands can be executed in a user-specific persistent database. The user must create
a mapset in an existing location; this mapset can then be accessed via ace . All processing results
of commands run in this mapset will be stored persistently. Be aware that the processing itself is
performed in an ephemeral database, which is moved to the persistent storage under the correct
name after processing.
To create a new mapset in the nc_spm_08 location with the name test_mapset the following
command must be executed (note: the "demouser" is not enabled for this):
Run the commands from the statistic script in the new persistent mapset:
Show all raster maps that have been created with the script in test_mapset:
If the active GRASS GIS session has identical location/mapset settings, then an alias can be used to
avoid the persistent option in each single command call:
alias acp="ace mapset=`g.mapset -p`"
We assume that in the active GRASS GIS session the current location is nc_spm_08 and the current
mapset is test_mapset. Then the commands from above can be executed in the following way:
See: https://github.com/mundialis/actinia_core/blob/master/scripts/curl_commands.sh
Own exercises in actinia
By now you have seen a lot of material. Time to try out some further exercises...
EXERCISE: "Population at risk near coastal areas"
needed geodata:
SRTM 30m (already available in actinia - find out the location yourself)
Global Population 2015 (already available in actinia - find out the location yourself)
vector shorelines (get from naturalearthdata)
(draft idea only; submit your suggestion for how to solve this task to the trainer)
proposed workflow:
actinia "ace" importer for building footprint upload
v.buffer of 10m and 30m around footprints
filter NDVI threshold > 0.6 (map algebra) to get the tree pixels - more exciting would be an ML
approach (with previously prepared training data ;-)) ( r.learn.ml offers RF and SVM)
on binary tree map (which corresponds to risk exposure)
count number of tree pixels in 5x5 moving window ( r.neighbors with method "count")
compute property risk statistics using buffers and tree count map and upload to buffered
building map ( v.rast.stats , method=maximum)
export of results through REST resources
https://github.com/mundialis/actinia_core/
See also
What does actinia mean?
Actinia is a beautiful sea creature, a genus of sea anemones in the family Actiniidae (see
Wikipedia). While the sea creature filters sea water, the actinia geoprocessing platform filters
large data oceans.
actinia Wiki
Citing actinia (DOI: 10.5281/zenodo.5879231)
openEO resources
Server: https://openeo.mundialis.de
user, password: upon request
REST introduction
Git for Windows offers the "git bash" and common tools
MSYS2 offers a bash and many tools along with pacman to install further packages
References
[1] Zell Liew, 2018: Understanding And Using REST APIs,
https://www.smashingmagazine.com/2018/01/understanding-using-rest-api/
The review by Vero Andreo is greatly appreciated.
Repository of this material on gitlab