20 Nov

Get angle/orientation/direction of GPS track using R

Hey,

Today, a bit of trSIGonometry, with a function that computes the angle/orientation/direction between two consecutive points of a GPX file.

############################
#Function
############################
 
angle <- function(pt1, pt2) {
  # Returns the bearing in degrees, clockwise from north (0 = N, 90 = E)
  conv <- 180 / pi # conversion radians -> degrees
  diff <- pt2 - pt1
  diff.abs <- abs(diff)
  if (diff[1] >= 0 & diff[2] >= 0) {angle <- pi/2 - atan(diff.abs[2] / diff.abs[1])}
  if (diff[1] >= 0 & diff[2] <= 0) {angle <- pi/2 + atan(diff.abs[2] / diff.abs[1])}
  if (diff[1] <= 0 & diff[2] >= 0) {angle <- 3*pi/2 + atan(diff.abs[2] / diff.abs[1])}
  if (diff[1] <= 0 & diff[2] <= 0) {angle <- pi + (pi/2 - atan(diff.abs[2] / diff.abs[1]))}
  return(angle * conv)
}
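
As a sanity check, the four quadrant cases above collapse into a single atan2() call. A minimal equivalent sketch (bearing measured clockwise from north; `bearing` is just an illustrative name, not part of the original script):

```r
bearing <- function(pt1, pt2) {
  d <- pt2 - pt1
  # x difference first: angle from north, clockwise, wrapped to [0, 360)
  (atan2(d[1], d[2]) * 180 / pi) %% 360
}
bearing(c(0, 0), c(1, 1))  # north-east -> 45
bearing(c(0, 0), c(0, -1)) # due south  -> 180
```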
 
############################
#Script
############################
 
library(rgdal)
library(raster)
library(data.table)
 
#Set folder
setwd("/home/GuruBlard/gpx/")
 
# Load file names
filenames <- list.files(pattern="\\.gpx$")
 
# Iteration through all gpx files
for (filename in filenames){
  # Load the GPX
 
  #ogrListLayers("export.gpx")
  gpx.trackpoints <- readOGR(dsn = filename,  layer= "track_points", stringsAsFactors = F)
 
  # Extract the coordinates as a matrix:
  gpx.tp <- coordinates(gpx.trackpoints)
 
  # Loop through and add the angle between the current point and the following one
  # (the last point has no successor, so it keeps NA):
  gpx.trackpoints$angle <- NA
  for (tp in 1:(nrow(gpx.tp)-1)) {
    gpx.trackpoints$angle[tp] <- angle(gpx.tp[tp,], gpx.tp[tp+1,])
  }
 
  gpx.trackpoints@data$Date <- substr(as.character(gpx.trackpoints@data$time), 1,10)
  gpx.trackpoints@data$Hour <- substr(as.character(gpx.trackpoints@data$time),12,19)
  gpx.trackpoints@data$filename <- filename
  writeOGR(gpx.trackpoints, paste0(tools::file_path_sans_ext(filename), ".shp"), layer=tools::file_path_sans_ext(filename), driver="ESRI Shapefile")
}


22 Mar

Download files from Dropbox via shareable link

Hello there!

I have a nice piece of code for today on how to download a file from a Dropbox shareable link (slightly adapted from code found here). Here is how it works: argument x is the document name, key the document key, and outfile the desired output filename and location.

dl_from_dropbox <- function(x, key, outfile) {
  require(RCurl)
  bin <- getBinaryURL(paste0("https://dl.dropboxusercontent.com/s/", key, "/", x),
                      ssl.verifypeer = FALSE)
  con <- file(outfile, open = "wb")
  writeBin(bin, con)
  close(con)        
}
 
# Example:
dl_from_dropbox("GViewer_Embeds.txt", "06fqlz6gswj80nj", '/home/GViewer_Embeds.txt')
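
If you prefer to skip RCurl, base R's download.file can fetch the same direct link. A sketch under that assumption (`dl_from_dropbox2` is a hypothetical name; the commented call reuses the key and filename from the example above):

```r
# Base-R variant: download.file with mode = "wb" handles binary files
dl_from_dropbox2 <- function(x, key, outfile) {
  url <- paste0("https://dl.dropboxusercontent.com/s/", key, "/", x)
  download.file(url, destfile = outfile, mode = "wb")
}
# dl_from_dropbox2("GViewer_Embeds.txt", "06fqlz6gswj80nj", "GViewer_Embeds.txt")
```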

15 May

Load googlesheet data in R

Dear R-addict,

In the era of cloud-based computing, functions such as read.csv, read.xls or even fread feel rather old-fashioned.
Here is a chunk of code that will allow you to load data from a googlesheet:

 
install.packages("gsheet")
library(gsheet)
# Download a sheet
url <- 'https://docs.google.com/spreadsheets/d/1XBs7p44-djCPmN4TnPEgboUVAdB2mChbAlCjqnVOyQ0'
a <- gsheet2tbl(url)
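
If you'd rather avoid an extra package, a link-shared sheet can also be read through Google's CSV export endpoint (same sheet URL as above). A sketch, assuming the sheet is publicly readable:

```r
url <- 'https://docs.google.com/spreadsheets/d/1XBs7p44-djCPmN4TnPEgboUVAdB2mChbAlCjqnVOyQ0'
csv_url <- paste0(url, "/export?format=csv")
# a <- read.csv(csv_url)  # uncomment to fetch over the network
```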

12 May

Download daymet data with R

Hey folks! It’s been a while!

Interested in meteorological data over the US? Here is a way to easily download Daymet data, inspired by the DaymetR package.

Daymet is a collection of algorithms and computer software designed to interpolate and extrapolate from daily meteorological observations to produce gridded estimates of daily weather parameters. Weather parameters generated include daily surfaces of minimum and maximum temperature, precipitation, humidity, and radiation produced on a 1 km x 1 km gridded surface over the conterminous United States, Mexico, and Southern Canada.

Download the “tiles” shapefile here.

library(downloader) # provides download(); install.packages("downloader") if needed
 
setwd("/guru-blard/DAYMET")
 
param = c("vp", "tmin", "tmax", "swe", "srad", "prcp", "dayl") # see https://daymet.ornl.gov/overview.html for the available variables
tiles = c(11922:11925, 11742:11745) # ids of the tiles of interest (cfr shapefile)
year_range = 1985:2015
 
for (i in year_range) {
  for (j in tiles) {
    for (k in param) {
      download_string = sprintf("https://daymet.ornl.gov/thredds/fileServer/ornldaac/1219/tiles/%s/%s_%s/%s.nc", 
                                i, j, i, k)
      daymet_file = paste(k, "_", i, "_", j, ".nc", 
                          sep = "")
      cat(paste("Downloading DAYMET data for tile: ", 
                j, "; year: ", i, "; product: ", k, "\n", 
                sep = ""))
      try(download(download_string, daymet_file, quiet = TRUE, 
                   mode = "wb"), silent = FALSE)
    }
  }
}
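
For reference, a single pass through the loop assembles a URL of this shape (tile and year values taken from the vectors above):

```r
# One iteration's download URL: year, tile, year again, then the parameter
u <- sprintf("https://daymet.ornl.gov/thredds/fileServer/ornldaac/1219/tiles/%s/%s_%s/%s.nc",
             1985, 11922, 1985, "tmin")
u  # .../tiles/1985/11922_1985/tmin.nc
```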

21 Jan

Download OpenStreetMap (OSM) data and read it in QGIS

Happy New GIS,

Be sure to get the latest version of GDAL (Ubuntu installation).

Go to Geofabrik.
Download the compressed .osm.pbf file corresponding to your area (continent, country).

Then use ogr2ogr to convert the .pbf file into an SQLite database, which can be used directly in QGIS with Add SpatiaLite Layer.

ogr2ogr -f "SQLite" -dsco SPATIALITE=YES -spat xmin ymin xmax ymax GuruBlard_Region.db OSM_continent.pbf

where xmin ymin xmax ymax are the spatial query extents, in the SRS of the source layer(s) (or the one specified with -spat_srs). Only features whose geometry intersects the extents will be selected. The geometries will not be clipped unless -clipsrc is specified.

For further information, visit this post of Anita Graser.

31 Aug

Merging two data frames while keeping the original row order

Hello GIStouille,

Here is a nice function to merge two data frames while keeping the original row order (source).
keep_order=1 keeps the order of the first data frame (x).
keep_order=2 keeps the order of the second data frame (y).

merge.with.order <- function(x,y, ..., sort = T, keep_order)
{
  # this function works just like merge, only that it adds the option to return the merged data.frame ordered by x (1) or by y (2)
  add.id.column.to.data <- function(DATA)
  {
    data.frame(DATA, id... = seq_len(nrow(DATA)))
  }
  # add.id.column.to.data(data.frame(x = rnorm(5), x2 = rnorm(5)))
  order.by.id...and.remove.it <- function(DATA)
  {
    # gets in a data.frame with the "id..." column.  Orders by it and returns it
    if(!any(colnames(DATA)=="id...")) stop("The function order.by.id...and.remove.it only works with data.frame objects which includes the 'id...' order column")
 
    ss_r <- order(DATA$id...)
    ss_c <- colnames(DATA) != "id..."
    DATA[ss_r, ss_c]
  }
 
 
  if(!missing(keep_order))
  {
    if(keep_order == 1) return(order.by.id...and.remove.it(merge(x=add.id.column.to.data(x),y=y,..., sort = FALSE)))
    if(keep_order == 2) return(order.by.id...and.remove.it(merge(x=x,y=add.id.column.to.data(y),..., sort = FALSE)))
    # if you didn't get "return" by now - issue a warning.
    warning("The function merge.with.order only accepts NULL/1/2 values for the keep_order variable")
  } else {return(merge(x=x,y=y,..., sort = sort))}
}
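
The trick above boils down to three lines of base R: tag the rows of x with an id column, merge unsorted, then restore the order and drop the id. A minimal sketch with two toy data frames (names df1/df2 are illustrative):

```r
df1 <- data.frame(id = c(3, 1, 2), v = c("c", "a", "b"))
df2 <- data.frame(id = c(1, 2, 3), w = c(10, 20, 30))

df1$id... <- seq_len(nrow(df1))              # remember df1's original row order
m <- merge(df1, df2, by = "id", sort = FALSE)
m <- m[order(m$id...), names(m) != "id..."]  # restore order, drop helper column
m$id  # 3 1 2 -- same order as df1
```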

25 Aug

How to convert hdf files to tif?

Hi crazyGISfans,

The HDF format can sometimes be difficult to manage; we prefer working with GeoTIFF.
Here is how to convert HDF to TIF in R with gdal_translate:

library(gdalUtils)
library(tools)
 
setwd("/guru-blard/")
 
files <- list.files(pattern = "\\.hdf$")
band <- 1
 
for (file in files){
  gdal_translate(src_dataset=get_subdatasets(file)[band], dst_dataset=paste0(file_path_sans_ext(file),"_B", band,'.tif'))
}
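
As a side note on the output names: file_path_sans_ext() strips only the extension, so the band suffix lands right before .tif. A quick check with a hypothetical MODIS filename:

```r
library(tools)
# Build the output name the same way the loop above does
out <- paste0(file_path_sans_ext("MOD13Q1.A2017001.hdf"), "_B", 1, ".tif")
out  # "MOD13Q1.A2017001_B1.tif"
```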

That’s it.

03 Apr

How to speed up R code?

Dear codeRs,

I recently found a very comprehensive document on how to speed up R code and wanted to share it with you!
It covers various aspects such as identifying where your code slows down, parallel computing, interfacing C code with R, etc.
Don’t hesitate to go have a look here.

02 Apr

Extract raster values in parallel

Dear fellow Remote Senseis,

Here is a code chunk that will help you speed up the processing time when you want to extract raster values at specific locations.
It uses a package called snowfall, which allows parallel processing.
As usual, please find below a dummy example. As there are only 3 bands in the stack, you may not see the positive effect of parallel computing…try with a 100-layer stack and you’ll see.

Cheers

install.packages("snowfall")
 
library(snowfall)
library(raster)
 
ncpus <- 3
 
r <- (stack(system.file("external/rlogo.grd", package="raster")))
inSlist <- unstack(r) # convert to list of single raster layers
 
rId <- subset(r,1)
idPix <- which(rId[]>200)
 
sfInit(parallel=TRUE,cpus=ncpus)
sfLibrary(raster)
sfLibrary(rgdal)
a <- sfSapply(inSlist,extract,y=idPix)
sfStop()

27 Mar

Calculate hillshade with gdaldem based on the elevation and azimuth angle calculated from the acquisition time

Hi GIS fellows! Here is a spring present: let’s see how to simply create a hillshade based on the acquisition date and time!

sunPosition <- function(year, month, day, hour=12, min=0, sec=0,
                        lat=46.5, long=6.5) {
 
  twopi <- 2 * pi
  deg2rad <- pi / 180
 
  # Get day of the year, e.g. Feb 1 = 32, Mar 1 = 61 on leap years
  month.days <- c(0,31,28,31,30,31,30,31,31,30,31,30)
  day <- day + cumsum(month.days)[month]
  leapdays <- year %% 4 == 0 & (year %% 400 == 0 | year %% 100 != 0) & 
    day >= 60 & !(month==2 & day==60)
  day[leapdays] <- day[leapdays] + 1
 
  # Get Julian date - 2400000
  hour <- hour + min / 60 + sec / 3600 # hour plus fraction
  delta <- year - 1949
  leap <- trunc(delta / 4) # former leapyears
  jd <- 32916.5 + delta * 365 + leap + day + hour / 24
 
  # The input to the Astronomer's Almanac is the difference between
  # the Julian date and JD 2451545.0 (noon, 1 January 2000)
  time <- jd - 51545.
 
  # Ecliptic coordinates
 
  # Mean longitude
  mnlong <- 280.460 + .9856474 * time
  mnlong <- mnlong %% 360
  mnlong[mnlong < 0] <- mnlong[mnlong < 0] + 360
 
  # Mean anomaly
  mnanom <- 357.528 + .9856003 * time
  mnanom <- mnanom %% 360
  mnanom[mnanom < 0] <- mnanom[mnanom < 0] + 360
  mnanom <- mnanom * deg2rad
 
  # Ecliptic longitude and obliquity of ecliptic
  eclong <- mnlong + 1.915 * sin(mnanom) + 0.020 * sin(2 * mnanom)
  eclong <- eclong %% 360
  eclong[eclong < 0] <- eclong[eclong < 0] + 360
  oblqec <- 23.439 - 0.0000004 * time
  eclong <- eclong * deg2rad
  oblqec <- oblqec * deg2rad
 
  # Celestial coordinates
  # Right ascension and declination
  num <- cos(oblqec) * sin(eclong)
  den <- cos(eclong)
  ra <- atan(num / den)
  ra[den < 0] <- ra[den < 0] + pi
  ra[den >= 0 & num < 0] <- ra[den >= 0 & num < 0] + twopi
  dec <- asin(sin(oblqec) * sin(eclong))
 
  # Local coordinates
  # Greenwich mean sidereal time
  gmst <- 6.697375 + .0657098242 * time + hour
  gmst <- gmst %% 24
  gmst[gmst < 0] <- gmst[gmst < 0] + 24.
 
  # Local mean sidereal time
  lmst <- gmst + long / 15.
  lmst <- lmst %% 24.
  lmst[lmst < 0] <- lmst[lmst < 0] + 24.
  lmst <- lmst * 15. * deg2rad
 
  # Hour angle
  ha <- lmst - ra
  ha[ha < -pi] <- ha[ha < -pi] + twopi
  ha[ha > pi] <- ha[ha > pi] - twopi
 
  # Latitude to radians
  lat <- lat * deg2rad
 
  # Azimuth and elevation
  el <- asin(sin(dec) * sin(lat) + cos(dec) * cos(lat) * cos(ha))
  az <- asin(-cos(dec) * sin(ha) / cos(el))
 
  # For logic and names, see Spencer, J.W. 1989. Solar Energy. 42(4):353
  cosAzPos <- (0 <= sin(dec) - sin(el) * sin(lat))
  sinAzNeg <- (sin(az) < 0)
  az[cosAzPos & sinAzNeg] <- az[cosAzPos & sinAzNeg] + twopi
  az[!cosAzPos] <- pi - az[!cosAzPos]
 
 
  el <- el / deg2rad
  az <- az / deg2rad
  lat <- lat / deg2rad
 
  return(list(elevation=el, azimuth=az))
}
 
 
 
# calculate the sun position from date and time
# (d_wgs: presumably a SpatialPoints object with the scene location, defined elsewhere)
position <- sunPosition(2012, 05, 25, hour=15, min=40, sec=42.460, lat=d_wgs@coords[2], long=d_wgs@coords[1])
 
system(paste('gdaldem hillshade',
             '-of GTiff',
             '-az',position$azimuth,
             '-alt',position$elevation,
             paste('/path_to_data/dtm_file','.tif',sep=''),
             paste('/path_to_data/dtm_file','_hillshade.tif',sep='')))