Good news from Belgium: Course on Applied spatial modelling with R (April 13-14)


In two weeks, our 2-day crash course on Applied spatial modelling with R (April 13-14, 2016) will be given at the University of Leuven, Belgium: https://lstat.kuleuven.be/training/applied-spatial-modelling-with-r
During this course you will learn the following:

    The sp package to handle spatial data (spatial points, lines, polygons, spatial data frames)
    Importing spatial data and setting the spatial projection
    Plotting spatial data on static and interactive maps
    Adding graphical components to spatial maps
    Manipulation of geospatial data, geocoding, distances, …


    Density estimation, kriging and spatial point pattern analysis
    Spatial regression
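As a taste of the first topics on the list, below is a minimal sketch of handling spatial points with the sp package; the cities and coordinates are made-up example data for illustration only.

```r
library(sp)

## a plain data.frame with longitude/latitude columns (example data)
d <- data.frame(city = c("Leuven", "Brussels"),
                lon  = c(4.70, 4.35),
                lat  = c(50.88, 50.85))

## promote it to a SpatialPointsDataFrame
coordinates(d) <- ~ lon + lat

## set the spatial projection (WGS84 longitude/latitude)
proj4string(d) <- CRS("+proj=longlat +datum=WGS84")

## basic plotting works out of the box
plot(d)
```

Importing real data, interactive maps and geocoding are covered in the course itself.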

More information: https://lstat.kuleuven.be/training/applied-spatial-modelling-with-r. Registration can be done at https://lstat.kuleuven.be/forms/courses

New RStudio add-in to schedule R scripts

With the release of RStudio add-in support, a new avenue of productivity gains for R users has opened up. Thanks to Oliver, who wrote an RStudio add-in on top of taskscheduleR, scheduling and automating an R script from RStudio is now just one click away if you are working on Windows.

How? Just install the R packages below and the add-in will be available in the add-in tab of your RStudio session. Select your R script and schedule it to run whenever you want. We hope this saves you some day-to-day time; feel free to contribute further improvements. More information: https://github.com/jwijffels/taskscheduleR.

install.packages(c("data.table", "knitr", "miniUI", "shiny"))
install.packages("taskscheduleR", repos = "http://www.datatailor.be/rcube", type = "source")


taskscheduleR: R package to schedule R scripts with the Windows task manager

If you are working on a Windows computer and want to schedule your R scripts while you are off running, sleeping or having a coffee break, the taskscheduleR package might be what you are looking for. 


The taskscheduleR R package is available at https://github.com/jwijffels/taskscheduleR and allows R users to:

i) Get the list of scheduled tasks

ii) Remove a task

iii) Add a task

    - A task is basically a script with R code which is run through Rscript
    - Tasks can be scheduled 'ONCE', 'MONTHLY', 'WEEKLY', 'DAILY', 'HOURLY', 'MINUTE', 'ONLOGON' or 'ONIDLE'
    - After the script has run, you can check the log, which is written to the same folder as the R script and contains the stdout & stderr of the Rscript call

 

Below you can find an example of how to schedule your R script once or daily in the morning.

library(taskscheduleR)
myscript <- system.file("extdata", "helloworld.R", package = "taskscheduleR")

## run script once within 62 seconds
taskscheduler_create(taskname = "myfancyscript", rscript = myscript,
                     schedule = "ONCE", starttime = format(Sys.time() + 62, "%H:%M"))

## run script every day at 09:10
taskscheduler_create(taskname = "myfancyscriptdaily", rscript = myscript,
                     schedule = "DAILY", starttime = "09:10")

## delete the tasks
taskscheduler_delete(taskname = "myfancyscript")
taskscheduler_delete(taskname = "myfancyscriptdaily")
When the task has run, you can look at the log, which contains everything from stdout and stderr. The log file is located in the same directory as the R script.

## log file is at the place where the helloworld.R script was located
system.file("extdata", "helloworld.log", package = "taskscheduleR")
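Point i) above, listing what is currently scheduled, is covered as well. A short sketch (Windows only, as the package wraps the Windows task scheduler; the column names of the returned data.frame follow the output of the schtasks utility):

```r
library(taskscheduleR)

## get a data.frame with all tasks known to the Windows task scheduler
tasks <- taskscheduler_ls()

## inspect only our own tasks
subset(tasks, grepl("myfancyscript", TaskName))
```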

Who wants to set up an RStudio add-in for this?

Web scraping with R

For those of you who are interested in web scraping with R: enjoy the slides of our presentation on this topic at the last RBelgium meetup. The talk covers rvest, RSelenium and our own package scrapeit.core, which makes deploying, logging and replaying your scrapes easier.

The slides below are in Flash, so make sure you disable your ad blocker in order to view them.


 

If you are interested in scraping content from websites and feeding it into your analytical systems, let us know at bnosac.be/index.php/contact/mail-us so that we can set up a quick proof of concept to get your analytics rolling.
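To give a flavour of the rvest part of the talk, here is a minimal scraping sketch; the URL and CSS selector below are placeholders you would adapt to the site at hand.

```r
library(rvest)

## download and parse a page (placeholder URL)
page <- read_html("http://example.com/products")

## extract product names via a CSS selector (placeholder selector)
products <- page %>% html_nodes(".product-name") %>% html_text()

head(products)
```

For browser-driven scraping of dynamic pages, RSelenium follows the same pattern but talks to a real browser instead of parsing static HTML.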

Web scraping with R & novel classification algorithms on unbalanced data

Tomorrow, the next RBelgium meetup will be held at the bnosac offices. This is the schedule:

• 18h00-18h30: welcome & meet other R users

• 18h30-19h00: Web scraping with R: live scraping of products & prices from www.delhaize.be

• 19h15-20h00: State-of-the-art classification algorithms for unbalanced data, using the unbalanced package: Racing for Unbalanced Methods Selection

Interested? Feel free to join the event. More info: http://www.meetup.com/RBelgium/events/228427510/

 

 
