10 Nov

Count points in polygons

Hi guys,

Here is a quick R snippet to count points inside polygons:

library("raster")
 
x <- getData('GADM', country='ITA', level=1)
class(x)
# [1] "SpatialPolygonsDataFrame"
# attr(,"package")
# [1] "sp"
 
set.seed(1)
# sample random points
p <- spsample(x, n=300, type="random")
p <- SpatialPointsDataFrame(p, data.frame(id=1:300))
 
proj4string(x)
# [1] " +proj=longlat +ellps=WGS84 +datum=WGS84 +no_defs +towgs84=0,0,0"
proj4string(p)
# [1] " +proj=longlat +ellps=WGS84 +datum=WGS84 +no_defs +towgs84=0,0,0"
 
plot(x)
plot(p, col="red" , add=TRUE)

Here is the figure:
[Figure: the 300 random points (red) plotted over the Italian regions]

res <- over(p, x)
table(res$NAME_1) # count points
#               Abruzzo                Apulia            Basilicata
#                    11                    20                     9
#              Calabria              Campania        Emilia-Romagna
#                    16                     8                    25
# Friuli-Venezia Giulia                 Lazio               Liguria
#                     7                    14                     7
#             Lombardia                Marche                Molise
#                    22                     4                     3
#              Piemonte              Sardegna                Sicily
#                    35                    18                    21
#               Toscana   Trentino-Alto Adige                Umbria
#                    33                    15                     6
#         Valle d'Aosta                Veneto
#                     4                    22
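If you then want to map the counts, here is a minimal sketch (reusing x and res from above) that joins the table back onto the polygons and draws a quick choropleth:

# attach the per-region counts to the polygons and plot them
counts <- table(res$NAME_1)
x$n_points <- as.integer(counts[as.character(x$NAME_1)])
spplot(x, "n_points", main="Random points per region")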
19 Feb

Color a raster with gdal using gdaldem

Hi Gigis,
Have you ever looked for a simple way to colour a raster without manually modifying the symbology or writing a Python/R script to build a RAT?
Here is the solution for lazy gis people like me 😉
My example is a GeoTIFF raster with 0 for land, 1 for water and 255 for no data. Most of the time, when you open it in a GIS, you get a black overview because the contrast is stretched on the 255 value. Here is how to proceed:

1/ Write a simple text file colortable.txt such as:
0 white
1 blue
255 grey
2/ gdaldem color-relief -of VRT sourcefile.tif colortable.txt out.vrt

3/ Just open the VRT, and the magic is there!
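If, like in the other posts, you prefer to drive this from R, here is a minimal sketch (file names are placeholders):

# write the colour table and build the coloured VRT with gdaldem
writeLines(c("0 white", "1 blue", "255 grey"), "colortable.txt")
system("gdaldem color-relief -of VRT sourcefile.tif colortable.txt out.vrt")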

Khan-Guru, the Raster Magician

12 Aug

Share a legend between multiple plots using grid.arrange

Greetings, followeRs!

I wanted to share a really cool function to create a multiplot with a shared legend!
It was created by Hadley Wickham, don’t hesitate to go and check his other awesome functions.

library(ggplot2)
library(gridExtra)
library(grid) # for unit() and unit.c()
 
grid_arrange_shared_legend <- function(...) {
    plots <- list(...)
    g <- ggplotGrob(plots[[1]] + theme(legend.position="bottom"))$grobs
    legend <- g[[which(sapply(g, function(x) x$name) == "guide-box")]]
    lheight <- sum(legend$height)
    grid.arrange(
        do.call(arrangeGrob, lapply(plots, function(x)
            x + theme(legend.position="none"))),
        legend,
        ncol = 1,
        heights = unit.c(unit(1, "npc") - lheight, lheight))
}
 
dsamp <- diamonds[sample(nrow(diamonds), 1000), ]
p1 <- qplot(carat, price, data=dsamp, colour=clarity)
p2 <- qplot(cut, price, data=dsamp, colour=clarity)
p3 <- qplot(color, price, data=dsamp, colour=clarity)
p4 <- qplot(depth, price, data=dsamp, colour=clarity)
grid_arrange_shared_legend(p1, p2, p3, p4)
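To write the combined figure to disk, a minimal sketch (the file name and size are just examples) is to wrap the call in a graphics device:

# save the multiplot with its shared legend as a png
png("grid_shared_legend.png", width=1200, height=900, res=150)
grid_arrange_shared_legend(p1, p2, p3, p4)
dev.off()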
08 Jun

Assemble png into a grid using ImageMagick and R

Holà GISpa,

Here is an R snippet that uses the montage tool of ImageMagick to assemble PNGs into one grid. All the parameters are explained here.

pattern <- "CoolestPngEver_"
# list all the png files matching the pattern
In <- list.files("/export/homes/gurublard/", full.names=TRUE, pattern=pattern)
Out <- paste0("/export/homes/gurublard/", pattern, "all.png")
# label each tile with its file name, 4 columns, +5+50 spacing, 10 px border
command <- paste0("montage -label '%f' -density 300 -tile 4x0 -geometry +5+50 -border 10 ",
                  paste(In, collapse=" "), " ", Out)
system(command)
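Of course this needs ImageMagick to be installed on the system; a quick way to check from R:

Sys.which("montage") # returns an empty string if montage is not on the PATH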


21 Apr

Sampling a raster in a class-equalized fashion

Dear followers,

Here is a function that randomly samples every class of a raster equally, with the number of samples per class set by the rarest class in the landscape.

rSampleEqualClass <- function(r, prop=1){
 
  if((prop > 1) | (prop < 0)){
    stop('prop value outside bounds')
  }
 
  # identify n, the number of pixels to extract per class
  count <- as.matrix(table(r[]))
  n <- ceiling(prop * min(count))
 
  # extract the random pixels (n per class)
  l <- as.data.frame(sampleStratified(r, size=n))
  colnames(l) <- c("cell", "values")
 
  # create a new raster containing only the selected pixels
  r.out <- r
  r.out[] <- NA
  r.out[l$cell] <- l$values
 
  return(r.out)
}

And here is a small example to demonstrate its use:

library(raster)
 
r <- raster(system.file("external/rlogo.grd", package="raster"))
r[r<=105]<-0
r[r>105]<-1
plot(r)
 
for(i in 1:10){
  print(i)
  plot(rSampleEqualClass(r,prop=0.8),main=i)
}
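To check that the sample really is class-balanced, you can tabulate the values of the output raster (reusing r from the example above); each class should appear the same number of times:

table(rSampleEqualClass(r, prop=0.8)[])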
27 Mar

Calculate hillshade with gdaldem based on the elevation and azimuth angle calculated from the acquisition time

Hi GIS fellows! Here is a spring present: let’s see how to simply create a hillshade based on the acquisition date and time!

sunPosition <- function(year, month, day, hour=12, min=0, sec=0,
                        lat=46.5, long=6.5) {
 
  twopi <- 2 * pi
  deg2rad <- pi / 180
 
  # Get day of the year, e.g. Feb 1 = 32, Mar 1 = 61 on leap years
  month.days <- c(0,31,28,31,30,31,30,31,31,30,31,30)
  day <- day + cumsum(month.days)[month]
  leapdays <- year %% 4 == 0 & (year %% 400 == 0 | year %% 100 != 0) & 
    day >= 60 & !(month==2 & day==60)
  day[leapdays] <- day[leapdays] + 1
 
  # Get Julian date - 2400000
  hour <- hour + min / 60 + sec / 3600 # hour plus fraction
  delta <- year - 1949
  leap <- trunc(delta / 4) # former leapyears
  jd <- 32916.5 + delta * 365 + leap + day + hour / 24
 
  # The input to the Astronomer's Almanac is the difference between
  # the Julian date and JD 2451545.0 (noon, 1 January 2000)
  time <- jd - 51545.
 
  # Ecliptic coordinates
 
  # Mean longitude
  mnlong <- 280.460 + .9856474 * time
  mnlong <- mnlong %% 360
  mnlong[mnlong < 0] <- mnlong[mnlong < 0] + 360
 
  # Mean anomaly
  mnanom <- 357.528 + .9856003 * time
  mnanom <- mnanom %% 360
  mnanom[mnanom < 0] <- mnanom[mnanom < 0] + 360
  mnanom <- mnanom * deg2rad
 
  # Ecliptic longitude and obliquity of ecliptic
  eclong <- mnlong + 1.915 * sin(mnanom) + 0.020 * sin(2 * mnanom)
  eclong <- eclong %% 360
  eclong[eclong < 0] <- eclong[eclong < 0] + 360
  oblqec <- 23.439 - 0.0000004 * time
  eclong <- eclong * deg2rad
  oblqec <- oblqec * deg2rad
 
  # Celestial coordinates
  # Right ascension and declination
  num <- cos(oblqec) * sin(eclong)
  den <- cos(eclong)
  ra <- atan(num / den)
  ra[den < 0] <- ra[den < 0] + pi
  ra[den >= 0 & num < 0] <- ra[den >= 0 & num < 0] + twopi
  dec <- asin(sin(oblqec) * sin(eclong))
 
  # Local coordinates
  # Greenwich mean sidereal time
  gmst <- 6.697375 + .0657098242 * time + hour
  gmst <- gmst %% 24
  gmst[gmst < 0] <- gmst[gmst < 0] + 24.
 
  # Local mean sidereal time
  lmst <- gmst + long / 15.
  lmst <- lmst %% 24.
  lmst[lmst < 0] <- lmst[lmst < 0] + 24.
  lmst <- lmst * 15. * deg2rad
 
  # Hour angle
  ha <- lmst - ra
  ha[ha < -pi] <- ha[ha < -pi] + twopi
  ha[ha > pi] <- ha[ha > pi] - twopi
 
  # Latitude to radians
  lat <- lat * deg2rad
 
  # Azimuth and elevation
  el <- asin(sin(dec) * sin(lat) + cos(dec) * cos(lat) * cos(ha))
  az <- asin(-cos(dec) * sin(ha) / cos(el))
 
  # For logic and names, see Spencer, J.W. 1989. Solar Energy. 42(4):353
  cosAzPos <- (0 <= sin(dec) - sin(el) * sin(lat))
  sinAzNeg <- (sin(az) < 0)
  az[cosAzPos & sinAzNeg] <- az[cosAzPos & sinAzNeg] + twopi
  az[!cosAzPos] <- pi - az[!cosAzPos]
 
 
  el <- el / deg2rad
  az <- az / deg2rad
  lat <- lat / deg2rad
 
  return(list(elevation=el, azimuth=az))
}
 
 
 
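# d_wgs below is assumed to be a SpatialPoints object holding the scene centre
# in WGS84 (longitude, latitude); a minimal sketch with placeholder coordinates:
library(sp)
d_wgs <- SpatialPoints(cbind(6.5, 46.5),
                       proj4string=CRS("+proj=longlat +datum=WGS84"))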
# calculate the sun position from date and time
position<-sunPosition(2012, 05, 25, hour=15, min=40, sec=42.460,lat=d_wgs@coords[2], long=d_wgs@coords[1])
 
system(paste('gdaldem hillshade',
             '-of GTiff',
             '-az',position$azimuth,
             '-alt',position$elevation,
             paste('/path_to_data/dtm_file','.tif',sep=''),
             paste('/path_to_data/dtm_file','_hillshade.tif',sep='')))
05 Dec

Plot heat maps for correlations, confusion matrices etc.

Hi folks!

Wanna present your table in a nice way? Have you ever thought of heat maps?
Here is an example of how to use them:

library(ggplot2)
library(plyr)         # for ddply()
library(reshape2)     # to "melt" your dataset
library(scales)       # for the "rescale" function needed in heat maps
library(RColorBrewer) # convenient heat map colours
 
nba <- read.csv("http://datasets.flowingdata.com/ppg2008.csv")
head(nba)
 
# order the players by points per game and rescale each variable to [0, 1]
nba$Name <- with(nba, reorder(Name, PTS))
nba.m <- melt(nba)
nba.m <- ddply(nba.m, .(variable), transform, rescale = rescale(value))
 
# draw the heat map
p <- ggplot(nba.m, aes(variable, Name)) +
  geom_tile(aes(fill = rescale), colour = "white") +
  scale_fill_gradient2(low="#006400", mid="#f2f6c3", high="#cd0000", midpoint=0.5)
p

The nba data.frame has the following structure:

> head(nba)
            Name  G  MIN  PTS  FGM  FGA   FGP FTM FTA   FTP X3PM X3PA  X3PP ORB DRB TRB AST STL BLK  TO  PF
1   Dwyane Wade  79 38.6 30.2 10.8 22.0 0.491 7.5 9.8 0.765  1.1  3.5 0.317 1.1 3.9 5.0 7.5 2.2 1.3 3.4 2.3
2  LeBron James  81 37.7 28.4  9.7 19.9 0.489 7.3 9.4 0.780  1.6  4.7 0.344 1.3 6.3 7.6 7.2 1.7 1.1 3.0 1.7
3   Kobe Bryant  82 36.2 26.8  9.8 20.9 0.467 5.9 6.9 0.856  1.4  4.1 0.351 1.1 4.1 5.2 4.9 1.5 0.5 2.6 2.3
4 Dirk Nowitzki  81 37.7 25.9  9.6 20.0 0.479 6.0 6.7 0.890  0.8  2.1 0.359 1.1 7.3 8.4 2.4 0.8 0.8 1.9 2.2
5 Danny Granger  67 36.2 25.8  8.5 19.1 0.447 6.0 6.9 0.878  2.7  6.7 0.404 0.7 4.4 5.1 2.7 1.0 1.4 2.5 3.1
6  Kevin Durant  74 39.0 25.3  8.9 18.8 0.476 6.1 7.1 0.863  1.3  3.1 0.422 1.0 5.5 6.5 2.8 1.3 0.7 3.0 1.8

And here is the output. Nice, isn’t it?

[Figure: heat map of per-game NBA player statistics]
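The same recipe also works for a correlation matrix, as promised in the title. Here is a minimal sketch reusing the numeric columns of nba (midpoint is set to 0 since correlations range from -1 to 1):

# correlation matrix as a heat map, reusing melt() + geom_tile()
cor.m <- melt(cor(nba[, -1]))
ggplot(cor.m, aes(Var1, Var2)) +
  geom_tile(aes(fill = value), colour = "white") +
  scale_fill_gradient2(low="#006400", mid="#f2f6c3", high="#cd0000", midpoint=0, limits=c(-1, 1))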