Best Best Practices Ever!

Every once in a while I read something that is so insightful, so clearly written and so well documented that it enters my own personal pantheon of “Best Ever” documents. I recently added a new, simply divine article titled Best Practices for Scientific Computing and hope that everyone reading this post also takes the time to read that article. I’m including the outline here only to encourage you to read the article in its entirety.  It is extremely well written.

Continue reading

When k-means clustering fails

Letting the computer automatically find groupings in data is incredibly powerful and is at the heart of “data mining” and “machine learning”. One of the most widely used methods for clustering data is k-means clustering. Unfortunately, k-means clustering can fail spectacularly as in the example below.
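As a minimal sketch (not the example from the full post), the base R code below builds two concentric rings, a configuration that violates the compact, roughly spherical clusters k-means assumes:

```r
# Illustrative sketch: two concentric rings, a shape k-means cannot separate.
set.seed(1)
theta <- runif(300, 0, 2 * pi)
inner <- cbind(x = cos(theta) + rnorm(300, sd = 0.1),
               y = sin(theta) + rnorm(300, sd = 0.1))
outer <- cbind(x = 5 * cos(theta) + rnorm(300, sd = 0.1),
               y = 5 * sin(theta) + rnorm(300, sd = 0.1))
xy <- rbind(inner, outer)

# k-means looks for compact, roughly spherical clusters and ends up
# splitting each ring in half rather than separating the two rings.
km <- kmeans(xy, centers = 2)
plot(xy, col = km$cluster, pch = 19, asp = 1)
```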

Continue reading

Optimizing Data Access – Know your Hardware

The Library of Congress has a lot of information — hundreds of millions of pages of books and manuscripts. But no one has ever suggested that we store all of that information in a single, billion-page book. Instead, individual books are stored on shelves in stacks in rooms according to an organized system. Managing large datasets is just the same:  data should exist in manageably sized files stored in hierarchically organized directories. Unfortunately, many people working with large datasets try to do just the opposite. This post describes how converting thirty 200 GB files into three million 200 KB files reduced data access times from several hours to under a second.
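As a rough sketch of the idea (the directory layout and file names here are hypothetical, not the ones from the full post), organizing data hierarchically by date lets you read only the small file you need:

```r
# Hypothetical layout: one small file per station per day, e.g.
#   data/2015/06/15/station_0042.csv
# Reading one ~200 KB file is nearly instantaneous; scanning a 200 GB
# file for the same records can take hours.
station <- "station_0042"
date <- as.Date("2015-06-15")
path <- file.path("data",
                  format(date, "%Y"),
                  format(date, "%m"),
                  format(date, "%d"),
                  paste0(station, ".csv"))
df <- read.csv(path)
```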

Continue reading

Data Management Questionnaire

Sometimes merely filling out a questionnaire can cause you to think about problems in a new way.  When asked to answer a question that has never occurred to you before, you may find yourself reevaluating some of your core assumptions — assumptions you may not have known you had.  That is the power of asking questions. Our data management questionnaire poses questions in 12 categories that will help you figure out what you need, what you want, and perhaps give you a hint of how to get there.

Continue reading

Standard Country Names

What’s in a name?  That which we call a rose
By any other name would smell as sweet.

Ahhh, love.  Juliet speaks lovely poetry, but we learn, as the story unfolds, that names and the identification they impart are in fact extremely important.  This is no less true in data management, where country names are anything but standardized.
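As a small, hypothetical illustration of the problem (the names and lookup table below are invented for this example), the same country may appear under several spellings and must be mapped onto a standard such as ISO 3166-1 codes:

```r
# Hypothetical lookup: several spellings of the same country mapped
# onto ISO 3166-1 alpha-2 codes.
countryCodes <- c(
  "United States"            = "US",
  "United States of America" = "US",
  "USA"                      = "US",
  "South Korea"              = "KR",
  "Korea, Republic of"       = "KR",
  "Republic of Korea"        = "KR"
)

raw <- c("USA", "Republic of Korea", "United States of America")
unname(countryCodes[raw])
# [1] "US" "KR" "US"
```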

Continue reading

Methow Valley Air Quality

Mazama Science has released a new set of tutorials demonstrating the use of air quality R packages to investigate data from regulatory monitors and low-cost sensors. This post is just a short summary of what the tutorials cover. We invite anyone interested in wildfire smoke and air quality to run through the tutorials and provide feedback.

Continue reading

Qualitative Display of Air Quality Data

Graphical excellence is that which gives to the viewer the greatest number of ideas in the shortest time with the least ink in the smallest space.

Edward Tufte, The Visual Display of Quantitative Information

This post briefly summarizes our thoughts on best practices for designing public-facing data graphics for air quality data. The focus is on the types of charts we feel are appropriate for data (e.g. from low-cost sensors) that may not be as accurate as data collected by monitors using Federal Reference or Federal Equivalent Methods (see FRMs/FEMs and Sensors). Visualization types discussed will include the following, with a brief sketch of the qualitative approach after the list:

  • maps
  • time-series charts
  • calendars
  • status and forecast tables
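As a minimal sketch of the qualitative approach (the breakpoints and colors below only approximate the US EPA AQI categories for PM2.5 and may differ from those used in the full post), values can be binned into named categories and shown with color rather than as precise numbers:

```r
# Illustrative sketch: bin PM2.5 values (ug/m3) into qualitative,
# AQI-style categories instead of displaying exact numbers.
pm25 <- c(4, 9, 20, 38, 60, 180)
breaks <- c(0, 12, 35.5, 55.5, 150.5, 250.5, Inf)
labels <- c("Good", "Moderate", "Unhealthy for Sensitive Groups",
            "Unhealthy", "Very Unhealthy", "Hazardous")
colors <- c("green", "yellow", "orange", "red", "purple", "maroon")

category <- cut(pm25, breaks = breaks, labels = labels, right = FALSE)
barplot(rep(1, length(pm25)), col = colors[as.integer(category)],
        names.arg = pm25, yaxt = "n",
        main = "Qualitative display of PM2.5")
```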

Continue reading

Cross-origin requests with beakr

beakr is a lightweight and flexible web framework that allows you to incorporate R code as the middleware responsible for handling web requests. At Mazama Science, we developed beakr to simplify the process of creating R-based web services that we use to deliver a variety of products: data files, images, rendered Rmarkdown documents, etc.

In this article, we discuss using beakr to set a CORS header and create an example beakr instance that can respond to cross-origin JavaScript requests.
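A minimal sketch of the idea follows. It is based on beakr’s documented newBeakr() / httpGET() / listen() pattern; the route, port, and the wildcard “Access-Control-Allow-Origin” value are placeholders, and the full post covers the recommended configuration in detail.

```r
library(beakr)

newBeakr() %>%
  # Respond to GET requests at a hypothetical "/data" route
  httpGET(path = "/data", function(req, res, err) {
    # Allow cross-origin requests from any domain (placeholder value)
    res$setHeader("Access-Control-Allow-Origin", "*")
    res$setContentType("application/json")
    '{"status": "ok"}'
  }) %>%
  # Start the server
  listen(host = "127.0.0.1", port = 8080)
```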

Continue reading

MazamaSpatialUtils R package

Version 0.7 of the MazamaSpatialUtils package is now available on CRAN and includes an expanded suite of spatial datasets with even greater cleanup and harmonization than in previous versions. If your work involves environmental monitoring of any kind, this package may be of use. Here is the description:

A suite of conversion functions to create internally standardized spatial polygons dataframes. Utility functions use these data sets to return values such as country, state, timezone, watershed, etc. associated with a set of longitude/latitude pairs. (They also make cool maps.)

In this post we discuss the reasons for creating this package and describe its main features.
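To give a flavor of typical usage, here is a brief sketch; the data directory below is a placeholder and spatial datasets must first be installed with installSpatialData():

```r
library(MazamaSpatialUtils)

# Placeholder directory for installed spatial datasets
setSpatialDataDir("~/Data/Spatial")

# A few longitude/latitude pairs
lons <- c(-122.33, 2.35, 151.21)
lats <- c(47.61, 48.86, -33.87)

getCountryCode(lons, lats)   # e.g. "US" "FR" "AU"
getTimezone(lons, lats)      # e.g. "America/Los_Angeles" ...
```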

Continue reading