19 Feb

Hi Gigis,
Have you ever looked for a simple way to color a raster without manually modifying the symbology or using a Python/R script to build a RAT?
Here is the solution for lazy GIS people like me!
My example is a GeoTIFF raster with 0 for land, 1 for water and 255 for no data. Most of the time, when you open it in a GIS, you get a black overview because the contrast is stretched on the 255 value. Here is how to proceed:

1/ Write a simple text file colortable.txt such as:
0 white
1 blue
255 grey
2/ gdaldem color-relief -of VRT sourcefile.tif colortable.txt out.vrt

3/ Just open the VRT, and the magic is there!
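
The three steps can also be scripted from R (a minimal sketch using the same placeholder file names as above; the `system()` call is left commented so you can inspect the command before running it):

```r
# Step 1: write the color table (one "value colour" pair per line)
writeLines(c('0 white', '1 blue', '255 grey'), 'colortable.txt')

# Step 2: build the gdaldem color-relief command (VRT output, source untouched)
cmd <- paste('gdaldem color-relief -of VRT',
             'sourcefile.tif', 'colortable.txt', 'out.vrt')
# system(cmd)  # Step 3: run it, then open out.vrt in your GIS
```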

Khan-Guru, the Raster Magician

27 Mar

Hi GIS fellows! Here is a spring present: let's see how to simply create a hillshade based on the acquisition date and time!

```
sunPosition <- function(year, month, day, hour = 12, min = 0, sec = 0,
                        lat = 46.5, long = 6.5) {

  twopi <- 2 * pi
  deg2rad <- pi / 180

  # Get day of the year, e.g. Feb 1 = 32, Mar 1 = 61 on leap years
  month.days <- c(0, 31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30)
  day <- day + cumsum(month.days)[month]
  leapdays <- year %% 4 == 0 & (year %% 400 == 0 | year %% 100 != 0) &
              day >= 60 & !(month == 2 & day == 60)
  day[leapdays] <- day[leapdays] + 1

  # Get Julian date - 2400000
  hour <- hour + min / 60 + sec / 3600 # hour plus fraction
  delta <- year - 1949
  leap <- trunc(delta / 4) # former leap years
  jd <- 32916.5 + delta * 365 + leap + day + hour / 24

  # The input to the Astronomer's Almanac is the difference between
  # the Julian date and JD 2451545.0 (noon, 1 January 2000)
  time <- jd - 51545.

  # Ecliptic coordinates

  # Mean longitude
  mnlong <- 280.460 + .9856474 * time
  mnlong <- mnlong %% 360
  mnlong[mnlong < 0] <- mnlong[mnlong < 0] + 360

  # Mean anomaly
  mnanom <- 357.528 + .9856003 * time
  mnanom <- mnanom %% 360
  mnanom[mnanom < 0] <- mnanom[mnanom < 0] + 360
  mnanom <- mnanom * deg2rad

  # Ecliptic longitude and obliquity of ecliptic
  eclong <- mnlong + 1.915 * sin(mnanom) + 0.020 * sin(2 * mnanom)
  eclong <- eclong %% 360
  eclong[eclong < 0] <- eclong[eclong < 0] + 360
  oblqec <- 23.439 - 0.0000004 * time
  eclong <- eclong * deg2rad
  oblqec <- oblqec * deg2rad

  # Celestial coordinates
  # Right ascension and declination
  num <- cos(oblqec) * sin(eclong)
  den <- cos(eclong)
  ra <- atan(num / den)
  ra[den < 0] <- ra[den < 0] + pi
  ra[den >= 0 & num < 0] <- ra[den >= 0 & num < 0] + twopi
  dec <- asin(sin(oblqec) * sin(eclong))

  # Local coordinates
  # Greenwich mean sidereal time
  gmst <- 6.697375 + .0657098242 * time + hour
  gmst <- gmst %% 24
  gmst[gmst < 0] <- gmst[gmst < 0] + 24.

  # Local mean sidereal time
  lmst <- gmst + long / 15.
  lmst <- lmst %% 24.
  lmst[lmst < 0] <- lmst[lmst < 0] + 24.
  lmst <- lmst * 15. * deg2rad

  # Hour angle
  ha <- lmst - ra
  ha[ha < -pi] <- ha[ha < -pi] + twopi
  ha[ha > pi] <- ha[ha > pi] - twopi

  # Latitude to radians
  lat <- lat * deg2rad

  # Azimuth and elevation
  el <- asin(sin(dec) * sin(lat) + cos(dec) * cos(lat) * cos(ha))
  az <- asin(-cos(dec) * sin(ha) / cos(el))

  # For logic and names, see Spencer, J.W. 1989. Solar Energy. 42(4):353
  cosAzPos <- (0 <= sin(dec) - sin(el) * sin(lat))
  sinAzNeg <- (sin(az) < 0)
  az[cosAzPos & sinAzNeg] <- az[cosAzPos & sinAzNeg] + twopi
  az[!cosAzPos] <- pi - az[!cosAzPos]

  el <- el / deg2rad
  az <- az / deg2rad
  lat <- lat / deg2rad

  return(list(elevation = el, azimuth = az))
}

# Calculate the sun position from the acquisition date and time
# (d_wgs is a SpatialPoints object in WGS84 holding the scene centre)
position <- sunPosition(2012, 05, 25, hour = 15, min = 40, sec = 42.460,
                        lat = d_wgs@coords[2], long = d_wgs@coords[1])

system(paste('gdaldem hillshade',
             '-of GTiff',
             '-az', position$azimuth,
             '-alt', position$elevation,
             paste('/path_to_data/dtm_file', '.tif', sep = ''),
             paste('/path_to_data/dtm_file', '_hillshade.tif', sep = '')))
```
20 Feb

Tired of using ArcGIS to merge a huge number of shapefiles?

Here is a simple way to use ogr2ogr and merge shapefiles, so easy!

```
# list all the shapefiles to merge
list_files <- Sys.glob('/path_to_shapefiles/*.shp')

# create the destination file (copy of the first file to merge)
merged_file <- '/path_to_my_results/merged_shapefile.shp'
system(paste('ogr2ogr', merged_file, list_files[1]))

# update the merged file by appending all the remaining shapefiles of list_files
for (i in seq(2, length(list_files))) {
  system(paste('ogr2ogr', '-update', '-append', merged_file, list_files[i]))
}
```
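
If your GDAL is recent enough, the loop can be replaced by a single call: ogrmerge.py ships with GDAL 2.2 and later (an assumption about your install; the paths are the same placeholders as above, and the `system()` call is commented out so you can check the command first):

```r
# -single appends every input into one output layer; -o names the destination
merged_file <- '/path_to_my_results/merged_shapefile.shp'
cmd <- paste('ogrmerge.py -single -o', merged_file, '/path_to_shapefiles/*.shp')
# system(cmd)
```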
13 Jan

Need a high-resolution satellite image from Google Maps for a figure?
Here is an easy way!

```
library(dismo)

# enter the address here
x <- geocode('Croix du Sud 2, 1348 Louvain-la-Neuve')
e <- extent(as.numeric(x[5:8]) + c(-0.001, 0.001, -0.001, 0.001))
g <- gmap(e, type = "satellite")
plot(g)
```

03 Dec

Hi GISguys!

Here is a simple example of how to use the fast, NumPy-based gdal_calc.py to compute NDWI, with stats, histogram and overviews.

```
file    <- 'my_input_file.tif'
outfile <- 'my_output_file.tif'

# NDWI = (A - B) / (A + B), with A = band 2 (green) and B = band 4 (NIR)
system(paste('gdal_calc.py',
             '-A', file, '--A_band=2',
             '-B', file, '--B_band=4',
             paste('--outfile=', outfile, sep = ''),
             paste('--calc=', '"', '(A.astype(float)-B)/(A.astype(float)+B)', '"', sep = ''),
             '--type=Float32',
             paste('--co=', '"', 'COMPRESS=LZW', '"', sep = '')))

# compute stats and histogram
system(paste('gdalinfo -hist -stats', outfile))

# build overviews
system(paste('gdaladdo', outfile, '2 4 8 16'))
```

Here is another version with a Byte file as output (values ranging from 0 to 255: faster, and smaller files):

```
file    <- 'my_input_file.tif'
outfile <- 'my_output_file.tif'

# same NDWI, rescaled from [-1, 1] to the Byte range
system(paste('gdal_calc.py',
             '-A', file, '--A_band=2',
             '-B', file, '--B_band=4',
             paste('--outfile=', outfile, sep = ''),
             paste('--calc=', '"', '((((A.astype(float)-B)/(A.astype(float)+B))+1)*128)', '"', sep = ''),
             '--type=Byte',
             paste('--co=', '"', 'COMPRESS=LZW', '"', sep = '')))

# compute stats and histogram
system(paste('gdalinfo -hist -stats', outfile))

# build overviews
system(paste('gdaladdo', outfile, '2 4 8 16'))
```
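
As a quick sanity check of the rescaling (plain R arithmetic, nothing GDAL-specific): NDWI in [-1, 1] is shifted by 1 and multiplied by 128, so -1 maps to 0 and 0 maps to 128. Note that an NDWI of exactly 1 maps to 256, one step past the Byte maximum of 255, so fully saturated water pixels fall just outside the range on write.

```r
# the --calc expression above, applied to a few test NDWI values
ndwi <- c(-1, 0, 1)
scaled <- (ndwi + 1) * 128
# -1 -> 0, 0 -> 128, 1 -> 256
```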
27 Nov

Hi GisWorld,

Since GDAL 1.11, a driver for reading ESRI geodatabases (.gdb) is available (http://www.gdal.org/drv_filegdb.html).
It is thus possible to load the database in R:

```
library('rgdal')

myFeatureClass <- readOGR('/path_to_gdb/myGDB.gdb', layer = 'myFeatureClass')
```
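
If you don't know the feature-class names in advance, GDAL's ogrinfo tool can list them (this assumes the GDAL command-line tools are on your PATH; rgdal's `ogrListLayers()` is an in-R alternative). The `system()` call is commented out so the command can be checked first:

```r
# ogrinfo prints one line per layer (feature class) in the datasource
cmd <- paste('ogrinfo', '/path_to_gdb/myGDB.gdb')
# system(cmd)
```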
24 Oct

Hi GIS planet,

I have 4000 separate GeoTIFF images of 8000 by 8000 pixels, and I want to mosaic and clip those images based on polygons from a shapefile (each polygon covers about 20000 by 20000 pixels).
Here is the code I used:

```
library('rgdal')

## 1 Build a VRT
system(paste('gdalbuildvrt',
             '/path_to_images/LargeImage.vrt',
             '/path_to_images/*/*.tif'))

## 2 Clip & merge the files based on polygon delimitations from a shapefile
zones <- readOGR('/path_to_images/Zones.shp', 'Zones')

for (i in 1:length(zones@polygons)) {
  # get the upper-left and lower-right corners of the polygon
  ulx <- zones@polygons[[i]]@Polygons[[1]]@coords[2, 1]
  uly <- zones@polygons[[i]]@Polygons[[1]]@coords[2, 2]
  lrx <- zones@polygons[[i]]@Polygons[[1]]@coords[4, 1]
  lry <- zones@polygons[[i]]@Polygons[[1]]@coords[4, 2]

  system(paste('gdal_merge.py',
               paste('-o /path_to_images/LargeImage_Zone_', toString(i), '.tif', sep = ''),
               '-ul_lr', toString(ulx), toString(uly), toString(lrx), toString(lry),
               '/path_to_images/LargeImage.vrt'))

  system(paste('gdaladdo',
               paste('/path_to_images/LargeImage_Zone_', toString(i), '.tif', sep = ''),
               '2 4 8 16'))
}
```
22 Oct

With QGIS, you can now connect to an ESRI GDB through the 'Add vector layer' tool.
Just select the properties as specified in the figure below.

14 Oct

This post shows some examples of the function getData from the raster package. This function allows you to directly download data from 3 global datasets (administrative borders, altitude, climate). This will be shown for a country (Belgium). The code is below.

Administrative borders (from the GADM dataset) are available at 5 country sub-levels.

Altitude (from the SRTM dataset) at 90 m resolution.

Climate (available from WorldClim at 0.5 minutes of a degree, corresponding to 5.5 km of spatial resolution in Belgium).
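
A minimal sketch of the three calls (the country code 'BEL', the level-2 sub-division, and the lon/lat used to pick the WorldClim tile are my assumptions; every call downloads data, so an internet connection is needed):

```r
library(raster)

# Administrative borders from GADM (level 0 = country, up to the finest sub-level)
bel <- getData('GADM', country = 'BEL', level = 2)

# Altitude from SRTM (~90 m), returned as a RasterLayer clipped to the country
alt <- getData('alt', country = 'BEL')

# WorldClim at 0.5 arc-minutes: monthly precipitation and mean temperature
# (the tile is selected from the lon/lat you pass)
prec  <- getData('worldclim', var = 'prec',  res = 0.5, lon = 4.5, lat = 50.5)
tmean <- getData('worldclim', var = 'tmean', res = 0.5, lon = 4.5, lat = 50.5)

# altitude with the administrative borders overlaid
plot(alt)
plot(bel, add = TRUE)
```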

[Figures: mean precipitation by month; annual mean temperature]