Western Governors’ Association: Tip of the spear: The horizon for drought data and technology

Panel discusses drought monitoring, including the Soil Climate Analysis Network, the Drought Monitor, and the federal Open Data Initiative, plus the latest on Google Earth Engine’s hydrological applications currently in development

The Western Governors’ Association (WGA), established in 1984, represents the governors of 19 western states and the U.S.-flag islands. The WGA works to develop and promote consensus-based policy solutions, to exchange information and identify best practices, to collect data and perform quality research, and to educate the public and other policymakers.

In the spring of 2015, the WGA held a series of webinars addressing the issues and impacts surrounding drought in the West. This webinar, Tip of the spear: The horizon for drought data and technology, examined how scientists use data to understand drought and help policymakers anticipate dry conditions.

Participating in the webinar were Michael Strobel, Director, National Water and Climate Center, NRCS; Deke Arndt, Chief, Climate Monitoring Branch, NOAA/NCDC; Terry Fulp, Lower Colorado Regional Director, U.S. Bureau of Reclamation; and Rebecca Moore, Engineering Manager, Google Earth Outreach & Earth Engine at Google. The moderator was Tony Willardson, Executive Director of the Western States Water Council.

Here’s what they had to say.

“Water-related data and information is critical when it comes to accounting for, assessing, and managing drought and other water resources challenges,” began Tony Willardson. “Access to accurate and reliable data is fundamental to sound science and decision making, and while our capacity for water data gathering, analysis, and dissemination continues to expand, in many cases it’s not sufficient to meet our needs.”

The Western States Water Council, with the support of the Western Governors’ Association, has initiated a Water Data Exchange (WaDE), which will facilitate access to state information on water rights, water supply, water use, and water demand, he said. More information is available at www.westernstateswater.org.

Mr. Willardson then turned it over to the first speaker.

MIKE STROBEL, Director of the National Water and Climate Center

Mike Strobel gave a brief presentation about the snow survey, water supply forecasting, and soil climate analysis services offered by the Natural Resources Conservation Service (NRCS), as well as a new initiative to establish a national soil moisture network.

He presented a map of the U.S. showing the distribution of the monitoring networks. He noted that the green dots are manual snow courses located in high-altitude areas throughout the western U.S. and up into Canada and Alaska; the blue dots are automated SNOTEL sites; and the red dots are the soil moisture (SCAN) sites, which are nationwide.

The manual snow course measurements are part of a cooperative snow survey program that began over 100 years ago. “We work closely with many other agencies at the beginning of each month, from January through June, to collect snow depth and snow water equivalent measurements at over 1,100 sites in the U.S., Canada, and up into Alaska,” Mr. Strobel said. “These are sites that we physically have to go to with snow tubes and make manual measurements. Many of these sites have periods of record decades long, so they are very valuable records.”

The SNOTEL network is an automated system that began in the late 1970s; it extends throughout the 13 western states, as well as the Black Hills of South Dakota. “We have 885 sites that collect information in near real-time, so we transmit the information hourly, and not only do we collect information on snow depth and snow water equivalent, but we collect many other parameters.”

Mr. Strobel said the value of this is two-fold: the sensors transmit about 720 different measurements from each site every hour, and since the system is automated, it eliminates the need to send personnel into hazardous, high-altitude, snow-packed areas in the winter months. The sites are maintained in the summer months.

The sites consist of a number of sensors: a snow pillow on the ground to measure the weight of the snow, a depth sensor, air temperature, wind speed and direction, solar radiation, relative humidity, and other factors that really help explain what is happening at each of these sites from a climatic perspective, he said. About half of the SNOTEL sites are equipped to measure soil moisture and soil temperature; that part of the network is being expanded, because what happens in the soil is very relevant to the fate of the snowmelt come springtime, he said.

Mr. Strobel then presented a slide of examples of products put out by the National Water and Climate Center: a graphical distribution of snowpack, with both point maps and basin-fill maps based on percent of normal for a 30-year period, as well as a forecast for spring and summer streamflow. “We improve this as we go through the year, but at the first of each month, we give an idea through many of our different tools of what our forecast will be for the water supply for the upcoming summer,” he said.
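
The percent-of-normal figures on those maps come down to a simple ratio of the current observation to a 30-year normal for the same date. Here is a minimal sketch in Python, with hypothetical station names and made-up snow water equivalent values; it is not NRCS code, only an illustration of the arithmetic:

```python
# Minimal sketch (not NRCS tooling): express a station's snow water
# equivalent (SWE) as a percent of its 30-year normal for the same date.
# Station names and values below are made up for illustration.

def percent_of_normal(current_swe_in, normal_swe_in):
    """Percent of normal = current SWE / 30-year normal SWE * 100."""
    if normal_swe_in <= 0:
        return None  # no meaningful normal (e.g., a snow-free date)
    return 100.0 * current_swe_in / normal_swe_in

stations = {
    "HYPOTHETICAL_SITE_A": {"current": 14.2, "normal": 18.0},
    "HYPOTHETICAL_SITE_B": {"current": 9.5,  "normal": 9.0},
}

for name, obs in stations.items():
    pct = percent_of_normal(obs["current"], obs["normal"])
    print(f"{name}: {pct:.0f}% of normal SWE")
```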

The Soil Climate Analysis Network (SCAN) has 221 sites in 40 states and is critical for drought monitoring, he said. “We look at conditions in the soils and provide this information for folks that are looking not just at local agriculture, but at large spatial areas for drought conditions, soil conditions, and climatic conditions.”

The National Soil Moisture Network is a collaboration of many federal, state, and university groups that is looking at combining in situ measurements, remote sensing, and models to come up with a single product that can be used for assessing drought and soil moisture conditions, he said.

Mr. Willardson asked Mr. Strobel how soil moisture monitoring is important in understanding drought severity and its potential impacts.

Mr. Strobel noted that they are monitoring in situ, as well as with remote sensing and radar and with modeling. “We have different levels and approaches to looking at soil moisture and relating this to drought conditions,” he said. “By having this integration of various methods of measuring soil moisture, it gives us a better understanding of the spatial variability in the soil moisture. I mean that in the sense of looking at it across the country or across the region, but also with the in situ measurements. We’re not looking only at the surface, but we’re also looking at the root zone and the areas that certainly impact vegetation, agriculture, hydrology, and other aspects.”

“For example, with the SCAN network, we have sensors at 2, 4, 8, 20, and 40 inches in depth, so this gives us not just what’s happening on the surface but really a three-dimensional look at what’s happening with soil moisture,” he said. “This is really critical for applying this information and looking at drought conditions, because it’s not just what’s happening at the surface, and it’s not just what’s happening in one small area or one region where we have observations, but really bringing these kinds of sources of information together.”
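
To illustrate the “three-dimensional look” Mr. Strobel describes, point readings at those five depths can be rolled up into a single root-zone value. The sketch below assumes a simple depth-weighted average in which each sensor represents the span halfway to its neighbors; that layer assignment is an illustrative assumption, not an NRCS convention:

```python
# Sketch of turning SCAN-style point readings at 2, 4, 8, 20, and 40 inches
# into a single root-zone average. The layer thicknesses assigned to each
# sensor are an assumption for illustration, not an NRCS convention.

SENSOR_DEPTHS_IN = [2, 4, 8, 20, 40]

def layer_thicknesses(depths):
    """Assign each sensor the span halfway to its neighbors (top layer
    starts at the surface, bottom layer ends at the deepest sensor)."""
    edges = [0.0]
    for upper, lower in zip(depths, depths[1:]):
        edges.append((upper + lower) / 2.0)
    edges.append(float(depths[-1]))
    return [b - a for a, b in zip(edges, edges[1:])]

def root_zone_average(vwc_by_depth):
    """Depth-weighted mean of volumetric water content (fraction)."""
    weights = layer_thicknesses(SENSOR_DEPTHS_IN)
    return sum(w * vwc for w, vwc in zip(weights, vwc_by_depth)) / sum(weights)

# Hypothetical volumetric water content readings (fraction of soil volume).
readings = [0.12, 0.15, 0.18, 0.22, 0.25]
print(f"Root-zone average VWC: {root_zone_average(readings):.3f}")
```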

“The one area that I think we are lacking in when it comes to soil moisture, especially with the in situ measurements, is really having a period of record to compare it to,” Mr. Strobel continued. “We make measurements of soil moisture, but how do you compare them to what’s normal, to what’s average at that site? A period of record like we have with the snow courses and other weather and climate data sources is a 30-year period we can compare to, and we haven’t gotten there yet on most of our sites, so that’s an area where we need to see improvement if we’re really going to understand what’s happening with drought conditions.”


Mr. Willardson asked Mr. Strobel about the National Soil Moisture Network: when will it be active, and when will the data be available?

“There are three aspects of it: there are these in situ networks, there’s the remote sensing, and then there’s the modeling, and many of these areas have multiple sources of information,” Mr. Strobel replied. “There are many other networks that are more focused on state-level monitoring of soil moisture. The difficulty here is that we have different sources of information, different types of sensors, different ways of collecting the data, and different levels of spatial distribution, but we are working on that. One part of this is to test the integration of these different networks by doing a pilot study, so in January, we started a small pilot study in the Texas-Oklahoma area looking at using the information and starting to integrate it into a single product. This will continue through this year, and we’ll hopefully expand it into a nationwide network in future years.”

DEKE ARNDT, Chief, Climate Monitoring Branch, NOAA/NCDC

Deke Arndt began by noting that he is the chief of the Climate Monitoring Branch at NOAA’s National Climatic Data Center, and that the agency is currently in the middle of a consolidation, so it is technically now one of the National Centers for Environmental Information.

“We basically do the play-by-play of the climate system: we monitor what’s going on with the climate around the U.S. and around the world,” he said. “But probably more importantly for this conversation, we contribute a few authors to the U.S. Drought Monitor program, which is the weekly assessment of drought that is a shared responsibility among several federal and state agencies. We also host drought.gov, the web home of the National Integrated Drought Information System, or NIDIS, which shares a long history of partnership and work with the WGA and many partners out in the West.”

The drought portal is the web presence of NIDIS, and is at drought.gov, he said. “NIDIS itself is a multi-agency program shared among a handful of agencies, including three that are on the call today,” Mr. Arndt said. “It’s operationally led by NOAA, but it’s owned by many agencies. The program offices are housed in Boulder, Colorado. Drought.gov itself and the services related to drought.gov are here in the monitoring branch in Asheville. The NCDC, the National Climatic Data Center, also does climate monitoring; we speak to issues of water and drought, and we work very closely with the folks at drought.gov to ensure that we’re keeping track of exactly what is going on.”

Mr. Arndt then gave a quick overview of drought.gov. “It is the US drought portal and it is designed to be a gateway to information that is most relevant for the type of user that is coming in,” he said. “One thing that NIDIS historically has done very well is that they’ve really understood from day one that drought means different things to different people and different communities, even at different times of the year. They have always been very respectful of the fact that drought may mean something very different for a reservoir operator in California than it does for a corn grower in Iowa; they have vastly different sensitivities to time scales and to the type of precipitation and when precipitation occurs.”

Drought.gov and NIDIS have worked to establish regional pilot programs to determine which datasets, data approaches, and presentations are the most valuable for a number of different regions, he said. “NIDIS in general and drought.gov in particular have historically been designed for people who are heavily involved in managing and very sensitive to drought,” he said. “It has recently come to take on a little bit more of a general-public, informational approach. … If you were to go there, you could find some of these regional pilots and see how folks in your part of the country have chosen to construct the data that they use to work with each other on managing water and drought. There’s a handful of informational tools as well.”

“One thing that becomes very apparent working with drought over time is that we are a nation of states, especially when it comes to drought,” Mr. Arndt said. “The way that we manage water, the way that we codify and make our management policies into law, we look like a nation of states in the agreements, the management, and the professional and legal relationships between people who manage water that have been built up over a long time. So the approaches and the datasets that we use have also been built up over a long time.”

“It’s important to remember not just what we understand scientifically about drought, but that the things we do as practicing professionals in managing drought have developed over a long time, and the histories of the scientific development and the management development will be part of that shaft,” he said. “We’re talking about the tip of the spear today, but there is a historical shaft behind that spear with a great deal of knowledge, and while new data opportunities are really exciting, the way that they are integrated with existing practices and data is going to be really important in how we improve our drought management as we go forward.”

Mr. Willardson asked Deke Arndt what gaps still exist in data collection to make products such as the Drought Monitor more robust and useful.

“I think if you ask any scientist or any water manager, we would all say we want data up to the minute, we want it at the highest resolution possible, we want it to stretch back as far into history as possible, and we want it to be as accurate as possible; that’s the holy grail of what a dataset or a data product based on a dataset would offer,” responded Deke Arndt. “With remote sensing technology, the radar and satellite information, we have really increased our ability to get higher-resolution information.”


“The one gap that exists is how to attach relatively recent remote sensing and satellite observing technology, which has developed over the last five to 35 years or so, to the deeper record, which is 100 to 135 years long and is the data upon which our scientific understanding and many of our management practices are built,” Mr. Arndt continued. “I think that’s a real challenge and opportunity. Everybody is working on how to splice this really rich historical traditional data, like the snow course data Dr. Strobel was talking about, with satellite imagery that provides more frequent or higher-resolution information about snow but really lacks the ground-truthing accuracy. At NCDC, we have a program called the Climate Data Records program that’s trying to do exactly that: how do we stitch remotely sensed data onto the longer-term record, and even stitch together remotely sensed data from the different satellites that have provided it over time? We won’t solve that by ourselves.”
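
One simple way to think about splicing a short satellite-era series onto a longer station record is to calibrate out the mean offset over the years the two records share, then append the adjusted satellite values. The toy sketch below illustrates only that basic idea with made-up numbers; it is not the Climate Data Records methodology:

```python
# Toy illustration of splicing a short satellite-era series onto a longer
# station record: remove the mean offset over the years both records share,
# then append the adjusted satellite values. Not the NCDC/NCEI Climate Data
# Records methodology, just the basic idea of an overlap-period adjustment.

def splice(station, satellite):
    """station, satellite: dicts of year -> annual value."""
    overlap = sorted(set(station) & set(satellite))
    if not overlap:
        raise ValueError("no overlap period to calibrate against")
    bias = sum(satellite[y] - station[y] for y in overlap) / len(overlap)
    merged = dict(station)
    for year, value in satellite.items():
        if year not in merged:
            merged[year] = value - bias  # shift satellite onto the station baseline
    return merged, bias

# Made-up annual values for illustration.
station_record = {1950 + i: 100.0 + i * 0.1 for i in range(60)}    # 1950-2009
satellite_record = {2000 + i: 107.5 + i * 0.1 for i in range(20)}  # 2000-2019

merged, bias = splice(station_record, satellite_record)
print(f"Estimated satellite offset: {bias:+.2f}")
print(f"Merged record spans {min(merged)}-{max(merged)}")
```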

“I think it’s a big challenge to benefit the most from the rich resolution available through remotely sensed data, but also to account for some of the discontinuities and some of the disconnect between the era in which that satellite data has existed and our deeper understanding,” he said. “I’m from Oklahoma. You don’t talk about drought in Oklahoma unless you understand what happened in the 1930s and in the 1950s. Many folks in America and around the world understand that their management practices were really informed by situations that happened two or three generations ago. So I think that’s a neat opportunity, and it will generate a lot of research across the public, private, and academic sectors. I think it’s a nice opportunity for us to get a lot smarter in how we deal with drought.”

“Some of the technology is even putting sensors on cell phones,” said moderator Tony Willardson. “Is crowdsourcing going to be a viable option now and into the future to supplement some of our in situ measurements on the ground?”

“Crowdsourcing has actually been a solution proposed and executed by several groups around the world to help get handwritten data off of ship logs and old weather reports and into digital archives, where smart, innovative people can do a lot more with it,” replied Mr. Arndt. “I won’t go as far as the observing networks, but I know just from a liberating-data standpoint, it’s a really valuable approach.”

Mr. Willardson then asked Deke Arndt to discuss the usefulness of the drought monitor.

Mr. Arndt said that the drought monitor arose as a result of the drought community’s conversations with people around the country who are sensitive to drought, and what emerged was a single one-shot assessment of the drought situation around the US. “It’s sensitive to two very generic time scales of drought,” he said. “One would be short-term, which is drought that develops over weeks to months. This tends to matter most to the agriculture and wildfire management communities. Horticulture – those tend to be really sensitive to drought on the short term. Soil moisture depletes, and things get bad in those areas.”

“Long-term would be more groundwater, some types of reservoir management, things about the water supply, and water quality when you’re talking about long-term drought. So what it is is a weekly assessment of a number of different datasets called drought indicators; some of them have been tuned and blended to be a little more sensitive to the short term and some to the long term,” he continued. “Every week there is an actual person responsible as the editor-in-chief of the final assessment of the drought monitor. The important and great and innovative thing is that it is collaboratively built, both over the long term and even each week, so there are hundreds of local experts around the country who will provide advice and insight on conditions that aren’t necessarily able to be measured by the tools that we have now. … We are putting those together with a great deal of insight from the states and the communities out there, and the federal agencies that have to manage resources as well, especially in the West.”
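
The Drought Monitor itself is an expert-curated product rather than a formula, but the idea of blending indicator percentiles with different weights for short-term versus long-term drought can be sketched as follows. The indicators, weights, and values below are hypothetical, and the percentile-to-category mapping only loosely echoes the D0–D4 ranges used in Drought Monitor guidance:

```python
# Hypothetical sketch of blending drought indicator percentiles. The actual
# Drought Monitor is an expert-curated weekly assessment, not a formula; the
# weights below are made up, and the percentile-to-category thresholds only
# loosely echo the D0-D4 ranges described in USDM guidance.

def blend(indicators, weights):
    """Weighted average of indicator percentiles (0 = driest on record)."""
    total = sum(weights[name] for name in indicators)
    return sum(indicators[name] * weights[name] for name in indicators) / total

def category(percentile):
    for label, threshold in [("D4", 2), ("D3", 5), ("D2", 10), ("D1", 20), ("D0", 30)]:
        if percentile <= threshold:
            return label
    return "None"

# Made-up indicator percentiles for one location.
indicators = {"precip_60day": 8, "soil_moisture": 12, "streamflow": 15, "groundwater": 35}

short_term_weights = {"precip_60day": 0.5, "soil_moisture": 0.3, "streamflow": 0.15, "groundwater": 0.05}
long_term_weights  = {"precip_60day": 0.1, "soil_moisture": 0.2, "streamflow": 0.3,  "groundwater": 0.4}

for label, weights in [("short-term", short_term_weights), ("long-term", long_term_weights)]:
    p = blend(indicators, weights)
    print(f"{label} blend: {p:.1f}th percentile -> {category(p)}")
```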

TERRY FULP, Regional Director of the Lower Colorado Region for the Bureau of Reclamation

Terry Fulp began with a brief overview of what the Bureau of Reclamation does before focusing on its use of data and technology in accomplishing that mission.

He began by presenting a map of the Colorado River Basin, noting there is a geopolitical line separating the upper basin from the lower basin that was established with the 1922 compact among the seven basin states. “We’re sitting in the Lower Basin; our headquarters are at Boulder City, Nevada, which is the home of Hoover Dam, and we manage that lower basin, the area essentially from Lake Mead down to Mexico,” he said.

“The basin as a whole is over-allocated based on what we’ve seen over this last 100 years of inflows,” he said. “However, we’re not yet overused, primarily because the upper basin states have not fully developed their entitlements, and I think we all are concerned for the future as development does continue basin-wide.”

Mr. Fulp said there are many operational challenges, but the unique part of their system is that they can store roughly four times the average annual inflow into the system. “It’s a unique part of our system in that sense, in that we can buffer ourselves against drought because of that large amount of storage,” he said. He also noted that the Secretary of the Interior is essentially the watermaster of this part of the Colorado River.

His office has the responsibility to operate and maintain the last nearly 700 miles of the Colorado River. “What we mean by that is that all users have a contract to take water; we account for all water use, and we also schedule and deliver all the water to those turnouts for our customers to take, so we deliver about 7.5 MAF to those three lower division states, Arizona, Nevada, and California, plus another 1.5 MAF to Mexico,” he said. “Data is extremely important to us to do that, and so we have a large investment in a data network as well as in other technology.”

Mr. Fulp also pointed out that this is the lowest 15-year period seen in more than 100 years of record keeping. “If we look at 1,200 years of tree ring reconstructions, it is certainly one of the worst 15-year periods we’ve seen in 1,200 years. Probably the piece that’s really of concern for the future is that the look from climate models says that the drought we see today might be in the 20th to 25th percentile of the worst periods we might see in the future, so there is a lot of work going on basin-wide in terms of promoting conservation and other activities to stretch these water supplies as far as they can go.”
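
Ranking a 15-year period against a long reconstruction amounts to comparing its mean with every other 15-year window in the record. A sketch with entirely synthetic numbers, not actual Colorado River tree-ring data:

```python
# Sketch: rank a recent 15-year mean flow against all 15-year windows in a
# long reconstruction. The reconstruction values below are synthetic random
# numbers, not actual Colorado River tree-ring data.

import random

random.seed(42)
reconstruction = [15.0 + random.gauss(0, 2.5) for _ in range(1200)]  # annual flow, MAF (synthetic)

window = 15
rolling_means = [
    sum(reconstruction[i:i + window]) / window
    for i in range(len(reconstruction) - window + 1)
]

recent_mean = 12.0  # hypothetical mean inflow for the current 15-year period, MAF
rank = sum(m < recent_mean for m in rolling_means)
percentile = 100.0 * rank / len(rolling_means)
print(f"Current 15-year mean ranks at about the {percentile:.1f}th percentile "
      f"of {len(rolling_means)} reconstructed 15-year periods")
```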

“Currently our system, a system that can store about 60 MAF of water, is about half full, 49% full, and that’s about where it’s been over the last ten years,” he said. “During this drought, we were very fortunate that when we entered the drought, the reservoirs were almost completely full. That’s of course never guaranteed in these prolonged droughts.”

This year looks to be another below-average year, he said; the current snowpack is 80% of average and the inflow forecast is about 79% of average.

The center of their decision support system is a relational database, and from that database the information is served, he said. “To collect the real-time data, we have a very large data collection network with the USGS, as well as some of our own gauges,” he said. “More than 100 gauges are operated and maintained by our two agencies over our area here in the lower basin. About a quarter of those report on a 15-minute interval; the others report at least daily. They assist us in tracking water flow as well as all the diversions that folks are taking off the river, and the return flows, which are extremely useful in meeting downstream deliveries.”

“When we look at water use projections based on all that real-time network, we are able to track on a daily basis approximately 98% of all the water deliveries of the 9 MAF of water that is delivered to the lower basin states and Mexico,” he said. “It is a well-used website by all of our water purveyors; they track this daily to make sure they stay within their entitlement, which is an annual entitlement.”
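
The daily accounting Mr. Fulp describes boils down to accumulating each contractor’s measured diversions, crediting return flows, and comparing the running total against an annual entitlement. A hypothetical sketch; the contractor names, entitlements, and daily values are invented:

```python
# Sketch of the kind of daily accounting described above: accumulate each
# contractor's measured diversions, credit return flows, and compare the
# running total against an annual entitlement. Contractors, entitlements,
# and daily values are hypothetical.

from collections import defaultdict

ENTITLEMENTS_AF = {"Contractor A": 2_800_000, "Contractor B": 300_000}

use_to_date = defaultdict(float)

def record_day(contractor, diversion_af, return_flow_af):
    """Consumptive use for the day = diversion minus return-flow credit."""
    use_to_date[contractor] += diversion_af - return_flow_af

def status(contractor):
    used = use_to_date[contractor]
    return used, 100.0 * used / ENTITLEMENTS_AF[contractor]

# A few hypothetical daily reports.
record_day("Contractor A", 9_500, 2_100)
record_day("Contractor A", 9_800, 2_000)
record_day("Contractor B", 1_200, 150)

for name in ENTITLEMENTS_AF:
    used, pct = status(name)
    print(f"{name}: {used:,.0f} AF used to date ({pct:.2f}% of annual entitlement)")
```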

“We also do long-, mid-, and short-term projections, both for operational and planning purposes,” he added.

Hoover Dam, photo by Bureau of Reclamation

Mr. Fulp said the website is how the information and knowledge are disseminated. “This is an area that we have continued to work on over time, and it is an area we want to continue to expand,” he said. “Certainly all of our customers are very familiar with the products we put on our website and help us design new ones, but more recently, we’ve been involved in open water data activities within the Department of the Interior. The president’s executive order called for the federal government to make open, machine-readable data the new default for government information. We’re doing a pilot down here to further improve our visualization and improve the ability of others to use our data.”

Mr. Willardson asked Terry Fulp about the Open Data Initiative and sharing Reclamation’s data with the public and with users making decisions about drought conditions. Why is this happening now?

“We serve a lot of data out on our website, but it’s not really of a very interactive nature, and it also is a little bit of ‘inside baseball’ type of stuff in that it’s really directed more for our customers and our direct users of water who understand all the subtleties and complexities,” replied Mr. Fulp. “I think part of what we’re trying to do with the Open Data Initiative is to make it much clearer and easier for folks to access and understand, so that other people outside of our direct customers will be able to benefit from accessing that data, and maybe do their own analysis of it.”

Mr. Willardson then asked Mr. Fulp, as a reservoir operator, what tools he uses and how they are relevant to making on-the-ground decisions.

Mr. Fulp referenced Mr. Arndt’s earlier comment about the impact of drought depending on who you are and where you are. “Our example down here, when we’re managing this lower basin system, is that we’re at the bottom of the system and we have the benefit of this large amount of storage, so something like the drought monitor, although very interesting to us and something we do look at, we don’t really utilize in our decision making, because the water supply to us is mostly what we have in storage,” he said. He noted that over 90% of the water in the basin is generated in the upper basin, so they are very dependent upon reservoir storage.

“Real-time data collection is really crucial to our management down here, and we are measuring streamflows and diversions,” Mr. Fulp continued. “We’re measuring return flows – return flows are a big part of most western water systems to help meet users downstream. All of that’s in a very detailed priority system. Having real-time data is important, but having a long period of consistent data recording is also important. Things like soil moisture are also very valuable for the forecasters who are doing the inflow forecasts that we all use on the Colorado River system, so knowing current soil moisture helps get better projections of what we might get in runoff when we see a snowpack. All that is predicated on having long records, and I can’t overemphasize the importance of continuing all of our funding to make sure that our data is continually collected in a consistent manner at the right spatial and temporal scales.”

Lake Powell, by Bureau of Reclamation

Mr. Willardson then asked about interagency collaboration, noting that it has been unprecedented during the current drought: what do you see as the future for this kind of interagency collaboration?

“Certainly on the Colorado River, I can say that collaboration has been the foundation of every major policy decision we’ve made over the last 15 years,” replied Mr. Fulp. “These are really complex issues and problems we’re dealing with. Drought is the headliner of those, and they really only get solved through that collaborative approach, in my opinion. What we really all are striving for, I believe, is to avoid court litigation and final court decisions that may in fact not be the best decisions because of all those complexities. Much better decisions are made by the folks who are really affected and know what’s going on. For us here in the lower basin, I think we will absolutely continue that, and throughout the upper basin as well. On the Colorado River, this is how it works.”

Mr. Willardson asked, from a water management perspective, what key hydrologic data are we missing to help us better manage our water supplies during a drought?

Terry Fulp said that getting a better handle on antecedent water conditions to improve projections or forecasts of what’s going to come off, especially in snow-driven systems, is one of those areas we need to focus on. “That, and I would just reiterate that we can’t drop the basic data collection we’ve been doing for years, and that includes water levels at reservoirs and rivers as well as temperature and a lot of the other data that would go into those evapotranspiration-type things. All of this to me just says we have to be sure we continue what we have, figure out what else we need, and fund it appropriately.”

REBECCA MOORE, Engineering Manager, Google Earth Outreach & Earth Engine at Google

Rebecca Moore began by saying that with respect to drought, there is mature science, there are extensive databases that have been developed, and there are sophisticated decision support tools, and Google is ‘the new kid on the block’ coming into the field. “There are still some gaps with respect to access to data and the freshness of the data, and there are still opportunities to make improvements in those tools, so I’m going to speak about a new technology platform that we’ve developed over the last few years called Google Earth Engine, which is for global-scale environmental modeling.”

She noted that the first applications were in forestry, but there are promising applications being developed now for water resources and drought monitoring.

Ms. Moore began with the story of how Google Earth Engine was born in 2008 in the Brazilian Amazon. “We were approached by scientists in the Amazon; they were seeing deforestation at the rate of more than a million acres a year, and that deforestation could be observed from satellite imagery from space,” she said. “They had the science to do the analysis and virtually detect deforestation from that data coming in every day, but the problem was it was terabytes or even petabytes of remote sensing data, and when they tried to run that deforestation analysis on a single computer, it would take weeks. So they approached us to see if Google could build something new.”

“Google Earth and Maps are not quite appropriate for this problem, because while they allow you to look at satellite data or do navigation, they don’t allow you to do science, or to detect change, or to map trends on the changing surface of the planet,” she continued. “So it seemed like a Google-scale problem to try to address this type of environmental modeling.”

“There’s a tremendous amount of publicly available satellite data,” she said, presenting a slide showing the NASA satellites currently in orbit. “There’s a treasure trove of information about land, water, atmosphere, and ocean.”

“One excellent example that is very relevant for drought monitoring and evapotranspiration is the LANDSAT satellite,” she said. “It’s the longest continuously operating earth-observing mission; it’s been going for more than 40 years. It collects not just pretty pictures of the earth, but multi-spectral data that includes infrared and thermal bands. It covers the whole world every 16 days, and it’s now collected millions and millions of images. It’s a fantastic resource.”

“However, this is where it resides,” she said. “The data from the LANDSAT satellites and many other satellites tend to come off the satellite and go onto tapes in a vault in government archives, so the data is quite secure, which is excellent, but it’s not that accessible for doing analysis, particularly if you’re trying to do timely global analysis.”

“So that was our starting point: to liberate some of these massive big data earth observation datasets that are really critical for doing global earth monitoring and bring them online into Google data centers,” she said.

“And then into the platform called Earth Engine,” she said. “We call it Earth Engine because it’s an analytical engine for the planet, and the goal is to derive information from raster and vector data at scale.”
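
Earth Engine exposes this capability through public client libraries. The short Python sketch below builds a cloud-side Landsat composite and pulls back a single summary statistic without downloading any imagery; it assumes an Earth Engine account, and the collection ID, band names, and location are assumptions that may need adjusting:

```python
# Sketch using the public Earth Engine Python API: build a cloud-hosted
# Landsat composite and pull one summary number, without downloading any
# imagery. Requires an Earth Engine account; the collection ID, band names,
# and location below are assumptions and may need adjusting.

import ee

ee.Initialize()  # run ee.Authenticate() first on a new machine

# Roughly near the Palo Verde area; purely illustrative coordinates.
region = ee.Geometry.Point([-114.62, 33.43]).buffer(10_000)

composite = (
    ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")   # assumed collection ID
    .filterBounds(region)
    .filterDate("2020-06-01", "2020-09-01")
    .median()
)

# NDVI from the assumed near-infrared / red band names.
ndvi = composite.normalizedDifference(["SR_B5", "SR_B4"])

mean_ndvi = ndvi.reduceRegion(
    reducer=ee.Reducer.mean(), geometry=region, scale=30
).getInfo()

print("Mean summer NDVI over the region:", mean_ndvi)
```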

Ms. Moore then showed a timelapse animation of Las Vegas, noting that you can see Las Vegas growing while Lake Mead shrinks.

Ms. Moore noted they built a timelapse animation of the whole planet, which is available at EarthEngine.google.org/timelapse. “We have the entire earth at 30-meter resolution for almost 30 years, and you can go anywhere and see planetary change, such as ice caps receding, glaciers receding, urban development, and water bodies changing,” she said. “The thing that was key about it was that 2 million LANDSAT scenes were required to be analyzed – almost a petabyte of data. It required 2 million hours of computation, but because we ran that over 66,000 computers in parallel, we had the result in a day and a half, whereas it would have otherwise taken almost 300 years.”

The platform was built fundamentally for science, with the first major scientific result being a collaboration with Professor Matt Hansen at the University of Maryland to create the first detailed maps of global forest cover and change from 2000 to 2012, which was published in Science in 2013, she said.

“I wanted to tell you the story about the forestry case because we think it’s very relevant for water,” she said. “Our goal is to turbo-charge the best science and help drive it into operational use on near real-time data, and to make that information accessible to everyone – to all the decision makers who need access to it.”

“As an example, within just a couple of weeks of that global forest cover map and dataset being ready, a new application called Global Forest Watch was launched that is powered by Earth Engine, and it is doing near real-time updating of the state of the world’s forests,” she said.

While the first couple of years of Earth Engine were really focused on forestry, from 2015 forward they are very focused on water resources, drought, and flooding, she said, giving some examples of applications being built by science partners at the University of Idaho, the University of Nebraska, and the Desert Research Institute.

She then presented a slide showing the evapotranspiration reference fraction for the Palo Verde Irrigation District computed using Google Earth Engine, noting that they have moved the METRIC algorithm onto Earth Engine in order to accelerate the frequency with which results can be produced, take it to global scale, and make it freely available. “The challenge with this data today is there’s an asymmetry in who has access to this information from METRIC; you need to be somewhat of an expert to work with it, so the idea is to make this kind of information about evapotranspiration, water consumption by crops, much more accessible,” she said.
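
The METRIC energy-balance computation itself is far more involved, but the reference ET fraction it produces becomes a water-use estimate through one simple scaling step: actual ET is the fraction multiplied by the reference ET for the same period. A sketch with hypothetical field values:

```python
# The energy-balance computation inside METRIC/EEFLUX is far more involved;
# this sketch only shows the final scaling step that makes the "reference ET
# fraction" useful: actual ET = ETrF x reference ET. Field names and numbers
# are hypothetical.

def actual_et_mm(etrf, reference_et_mm):
    """Scale the reference ET fraction by reference ET for the same period."""
    return etrf * reference_et_mm

fields = {
    "field_01": {"etrf": 0.85, "ref_et_mm_day": 7.2},  # well-watered crop cover
    "field_02": {"etrf": 0.40, "ref_et_mm_day": 7.2},  # stressed or partially fallow
}

for name, f in fields.items():
    et = actual_et_mm(f["etrf"], f["ref_et_mm_day"])
    print(f"{name}: ~{et:.1f} mm/day crop water use")
```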

“What you are seeing here are two slides,” she said. “In the background is the evapotranspiration algorithm EEFLUX running inside Google Earth Engine, and this overlay is an example of the use of this data by Gallo Winery to monitor crop water use.”

Another application under development with the goal of making information about drought and climate more accessible is called CLIME, she said. “You come into a simple web interface, and you can choose from a variety of indices and datasets and derived datasets that are produced on the fly in Earth Engine based upon fresh data,” she said.

She then presented a sample of the Palmer Drought Severity Index for the US. “It’s not that this type of index hasn’t been available before, but our goal with these science partners is essentially to turbo-charge their work,” she said. “They shouldn’t have to worry about the infrastructure of managing the data coming in every day, having access to thousands of computers to do really quick analysis, and turning that around into a very accessible web-hosted application. We’re trying to solve some of those infrastructure challenges to really transform access to this type of data.”

Tony Willardson then asked Rebecca Moore about the efforts Google is making to expand the inclusion of hydrologic data alongside the satellite imagery.

“We did start more with the satellite imagery and the datasets that were required for forest monitoring, but now we are bringing in a number of NOAA datasets, weather datasets, and some hydrology-related datasets, such as watershed boundaries and flow accumulation layers,” Ms. Moore replied. “We have partners that are working on analyzing flood risk; Rick Allen’s group is doing evapotranspiration; and others are even quantifying river migrations, so we do see water monitoring, management, and drought forecasting as a critical area for us. We crowdsource from the scientists; they vote on what we should make sure is in Earth Engine, and we are seeing a lot of interest in hydrology datasets, so we are prioritizing those.”
