BAY DELTA SCIENCE CONFERENCE: Managing the state and federal water projects: The Collaborative Science and Adaptive Management Program

Part 1: Collaborative Science and Adaptive Management Program: Moving from Litigation to Collaboration

The Collaborative Science and Adaptive Management Program (CSAMP) is an applied science program specifically designed to inform decisions regarding operations of the State Water Project and the Central Valley Project and species protection in the Delta. The Program was established in 2013 as an outgrowth of litigation and is intended to provide an alternative where parties can work together to address critical uncertainties and promote common understanding.  At the Bay Delta Science Conference, Bruce DiGennaro with the Essex Partnership explained how the program works.

Since the Collaborative Science and Adaptive Management Program (or CSAMP) is an outgrowth of litigation over the salmon and Delta smelt biological opinions, the process is focused on the south Delta, the export facilities, and some of the science issues that were at the base of that litigation.

Mr. DiGennaro explained how CSAMP came into being. The program was initiated in 2013 when the parties involved in the litigation went to the judge and asked for more time to work outside of court to try to resolve some science questions. They began the process with annual check-ins with the judge. “Both cases were on appeal to the Ninth Circuit, which largely ruled in favor of the agencies, but all the parties agreed at that time that we had something good going on; and so we’ve continued, although we’re no longer under the authority of a court or a judge,” Mr. DiGennaro said. “In fact, we’re under no authority whatsoever. It’s a strictly voluntary endeavor.”

The list of participants includes state and federal resource agencies, the Bureau of Reclamation, the Department of Water Resources, the NGOs involved in the lawsuit as well as others representing environmental interests, and the public water agencies, particularly south-of-Delta water users. There has been interest in more people getting involved as the issues grow, so that’s a possibility, he said.

Mr. DiGennaro pointed out that moving from litigation to collaboration is not a simple thing. “This is a pretty significant paradigm shift; you’re going to take people that have been in court really beating each other up for a while, and try to move them to working side by side,” he said. “You’re moving from an adversarial position to a collaborative one, and that’s a paradigm shift. There’s a legacy of mistrust because of the litigation, so now you need to build that trust back up. It doesn’t happen overnight; it takes some time.”

“It makes sense that science be the basis for that collaboration,” he continued. “We all want sound science, the parties agree on that, and science can be a unifying platform that we can move forward on. We’re not going to agree on everything, but we can at least do that in a civil discourse.”

There are a number of things that make CSAMP different. One of them is that the program embodies some of the concepts of a ‘skunkworks’, a term that refers to bringing a small group together to force rapid innovation by removing organizational boundaries. “I’m not going to say we’re fully a skunkworks, but some of the concepts apply here,” Mr. DiGennaro said. “We’re all about applied science, or what kind of information would help decision makers, particularly relative to these decisions we have to make about operations of the pumps. We’ve chosen, because of the nature of where we came from, to be involved in very controversial issues, so we’ve agreed that that’s what we’re going to do, and that’s what we’re going to focus on. In a lot of ways, this is about communication. We’re a forum for dialog that is now leading to common understanding.”

The Delta is a very complex system; not only is the ecosystem complex, but the institutional governance system is complex as well, which makes it ever more challenging, he said. “We have decision makers, the managers who are making more day-to-day decisions and running the operation, and then there are scientists doing work to try to support this operation,” he said. “But it’s not quite that simple. There are a whole lot of different organizations that are making different kinds of decisions on different time scales. Similarly, there are a lot of different managers that are operating, and there are scientists from different organizations that might or might not talk to each other, so it starts to get compartmentalized.”

“I would argue that we do pretty well at communicating horizontally with our peer groups,” he said. “The folks at the top here, the vast majority, are schooled in very different concepts. They have MBAs, they studied organizations … so they’re pretty good at talking to each other [horizontally]. What I would argue is that we don’t talk to each other very much going [vertically]. Now the managers are probably talking to their respective scientists, but I would argue that sometimes things get lost in the middle; maybe those conversations are happening, but it doesn’t always happen that these folks are directly talking to each other, and if they do, they don’t even understand each other. Then you get these silos; maybe they are talking to each other, but they are all in these silos.”

What they are trying to do is move information up and down between the groups. “It is a double-headed arrow for a reason,” he said. “The scientists really need to understand what it is the managers are trying to make decisions about so they can help. The managers need to understand what information is being developed down here that can help them make that decision. So we’re trying to break that down in a lot of different ways, through things like skunkworks and things like that. But this conceptually is what we’re trying to accomplish in CSAMP.”

CSAMP has a two-tiered organizational structure comprised of a Policy Group made up of agency directors and top-level executives from the entities involved in the litigation, and the Collaborative Adaptive Management Team (CAMT), which includes designated managers and scientists representing state and federal agencies, water contractors, and non-governmental organizations, serving as a working group functioning under the direction of the Policy Group. The Delta Science Program and the Interagency Ecological Program Lead Scientist are involved as well. Technical teams are formed as needed. “Depending on the need, we form a team, we pull together the experts, and a lot of times we bring in other experts where we need to, and we retain experts to do independent studies,” he said.

The mission statement for CSAMP is:  ‘Work, with a sense of urgency, to develop a robust science and adaptive management program that will inform both the implementation of the current Biological Opinions, including interim operations; and the development of revised Biological Opinions.’

Mr. DiGennaro pointed out that they are trying to work with a sense of urgency. “We’re trying to be responsive to management needs,” he said. “We’re looking at the current biological opinions but also trying to think about future biological opinions, whether those might be under Water Fix or the reconsultation that is underway now. But we are pretty focused on the Delta, the biological opinions, and management of salmon and smelt.”

Currently, the technical investigations that are underway include the application of Delta smelt survey data, fall outflow management for Delta smelt, Old and Middle River management and Delta smelt entrainment, and south Delta salmonid survival.

Part 2: Collaborative Adaptive Management Team (CAMT) Investigations: Using New Modeling Approaches to Understand Delta Smelt State Salvage Patterns at the State Water Project and Central Valley Project

The Collaborative Adaptive Management Team, comprised of high-level managers and senior scientists, is the group that works underneath the CSAMP policy group.  The CAMT was established to work with a sense of urgency to develop a robust science and adaptive management program to inform both the implementation of the current BiOps and the development of revised BiOps.  CAMT was charged with preparing a workplan for the Court that identifies topic areas where significant disagreement exists between parties and describes how the topics will be addressed through a collaborative science process.

As part of the workplan, the CAMT entrainment team came up with four studies that stem from two management questions about Delta smelt entrainment that needed answers. Dr. Lenny Grimaldo with ICF International highlighted the team’s findings.

CAMT examined historical (1993-2015) salvage data to determine what factors affected Delta Smelt salvage at the State Water Project (SWP) and Central Valley Project (CVP) fish facilities. The objective was to determine if new approaches could be applied to the data to yield new insights about the factors that explain Delta Smelt salvage patterns within and across years.

“These are things that need to be managed in real-time effectively so you are not having huge population consequences, but then the second part is the actual modeling of the population consequences so you know what those impacts are to the species,” he said, noting that a number of other scientists are involved.  “The idea was for us to work together to figure out the key uncertainties so that we could forge a path forward.”

Dr. Grimaldo presented a graph from the 2008 biological opinion, noting that a few things were left out at the time.  “Back in the day, this graph was revolutionary, but what it shows is total salvage versus combined Old and Middle River flow,” he said.  “At the time, we didn’t even know there was this parameter, Old and Middle River flow … we actually published a paper together on this, and here’s this plot showing that basically salvage increases as you have higher net reverse flow.  So this was very interesting, but that was 2008.”

But today, a lot more information is needed than this graph provides, he said. “We need information on finer-scale variability and salvage dynamics, and there are potential variables that we forgot. Back when we did these initial plots, we only explained about 30% of the variation in salvage … and then fish behavior during first flush events: what really drives those initial salvage events? And then ultimately the population impacts … however you interpret those population studies, you still have to manage to make sure those salvage events don’t lead to those high entrainments. So that’s where this work, I think, is important.”

Dr. Grimaldo said that one of the initial sparring matches within the CAMT team was over conceptual models.

“We all thought about the same parameters, but we just thought differently about how they operated to eventually affect how Delta smelt were salvaged, and most importantly, what conditions create entrainment risk,” he said. “When I first saw the conceptual model on the right, in reviewing it, I realized that there is an important component here in this first flush event that drives salvage, and that is actually very important … but at the end of the day, we all thought about the same parameters that may affect salvage risk; we just had a different way of thinking about what triggers what, from the different drivers down to the responses.”

Dr. Grimaldo said the focus of his talk would be on Delta smelt movements during the first flush. “There’s another side debate to all this – is it migration or diffuse dispersal? There was a paper by Dr. Sommer et al which described the movement of Delta smelt during first flush as a mostly unidirectional migration, but he recognized that wasn’t the case all the time. After the first flush, the fish that were down near Suisun Bay migrated up to the north Delta, or they took the wrong turn to the south Delta and got entrained; they calculated the number of days that it took for smelt to do that.”

A later paper published in 2013 looked at the Kodiak trawl data and made the case that the fish are actually doing more dispersal movements from the channels into shore areas and marshes.

This is important to the patterns that they see related to salvage risk, he said. “In both cases, whether it’s a migration movement or just diffuse dispersal that they are responding to, when the water becomes more turbid during the first flush and they are expanding their habitat, the conceptual model was that they were following this turbidity bridge from the lower Sacramento down to the pumps – meaning that if the water was clear, they had a lower risk of entrainment. So that’s the basic premise of both of those models.”

Dr. Grimaldo presented a plot from the Kodiak trawl during a very high outflow period in February 2006, and said that in reality, Delta smelt are probably doing both. “The Sacramento River flow was in the 80,000s [cfs], outflow was over 300,000 cfs, and look at this broad distribution of Delta smelt,” he said. “We had fish actually go the other way, down to the Napa River, and we had a lot of fish go into the Suisun Marsh area, so if you’re a Delta smelt that relies on freshwater for spawning, and that freshwater comes on top of you, you don’t have to move far. You just have to find the actual spawning substrate.”

He noted that even with the high outflows, Delta smelt were salvaged that year, although he acknowledged it was not a lot. “These are ripping flows; these aren’t fish just passively deciding to wander their way to the south Delta … the point being, Delta smelt are probably doing a combination of all these behaviors, depending on the spawning habitat they are looking for, their hardwired behavior, and other factors.”

Dr. Grimaldo then presented a set of charts from work he did in 2009 (below, left). “Some of the highest salvage events (shown in dark black) happen soon after this first flush period of outflow and turbidity,” he said. He then presented a figure from Wim Kimmerer’s 2008 paper (below, right) that shows the first flush can happen as early as mid-December. “You could be done with your salvage events, which have been modeled to have huge population effects, before the Kodiak trawl even starts; so before you even have an idea of how big the population size is, you can have these events that have significant impacts on the population. This is why it is really important to identify the first flush conditions.”
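As a rough illustration of what identifying first flush conditions might look like with routine monitoring data, here is a minimal Python sketch. The column names and threshold values are assumptions for illustration, not values from the presentation or the CAMT analyses.

```python
# Minimal sketch (not the CAMT analysis): flag candidate "first flush" days from
# a daily record of Delta outflow and turbidity. Column names and threshold
# values are illustrative assumptions.
import pandas as pd

def flag_first_flush(df, turbidity_threshold=12.0, flow_ratio_threshold=2.0, window=7):
    """Return the earliest date in each December-March season when turbidity
    exceeds a threshold while outflow has risen sharply relative to the mean
    of the preceding week."""
    df = df.sort_values("date").copy()
    prior_mean = df["outflow_cfs"].rolling(window).mean().shift(1)
    df["flow_ratio"] = df["outflow_cfs"] / prior_mean
    hits = df[(df["turbidity_ntu"] >= turbidity_threshold) &
              (df["flow_ratio"] >= flow_ratio_threshold)].copy()
    # Assign December dates to the season beginning that year; Jan-Mar dates to the prior year.
    hits["season"] = hits["date"].dt.year.where(hits["date"].dt.month == 12,
                                                hits["date"].dt.year - 1)
    return hits.groupby("season")["date"].min()
```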

Dr. Grimaldo acknowledged that the Fall Midwater Trawl nowadays is pretty lousy for sampling Delta smelt.  “We get very few,” he said.  “So this problem is even worse as now we don’t even have a gear, so we don’t even have an idea what the size is coming into December.”

Dr. Grimaldo suggested that a Kodiak trawl survey should start in September. “We know that it catches fish better than the Fall Midwater Trawl. I think folks just have to make the leap. I think folks are used to the Fall Midwater Trawl being this 40-year-plus monitoring device, but maybe we need to switch things up, because we know from other work that Ken Newman and Randy Baxter are doing that the Kodiak is a better gear for sampling Delta smelt, so why not go for it.”

“This could potentially allow for salvage losses to be evaluated in the context of a recruit responder model,” he said.  “At least you could have an idea going into the salvage season what your salvage actually means.”

He presented a graph plotting days from December 1st against cumulative salvage, noting that in most years, if you were to apply the current Service take calculator to the historic years, you would exceed your take in a lot of these early years within a week. They looked at cumulative salvage up to 25% and, as another response, up to 50%; they also looked at annual salvage overall. “We really wanted to see what is driving this initial pulse, and we did this a ton of different ways. We used GLM negative binomial models, GAM models, and boosted regression trees; I’m going to show snippets from all three.”
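To make those response variables concrete, here is a minimal Python sketch of how cumulative salvage from December 1 and the 25% and 50% thresholds might be computed from a daily salvage table; the column names are hypothetical and the code is not the team's actual workflow.

```python
# Minimal sketch of the response variables described above: annual salvage, plus
# the number of days from December 1 needed to accumulate 25% and 50% of the
# annual total. Column names (water_year, date, salvage) are assumptions.
import pandas as pd

def salvage_response_metrics(daily_salvage: pd.DataFrame) -> pd.DataFrame:
    rows = []
    for wy, grp in daily_salvage.groupby("water_year"):
        grp = grp.sort_values("date")
        total = grp["salvage"].sum()
        cum = grp["salvage"].cumsum()
        dec1 = pd.Timestamp(year=wy - 1, month=12, day=1)  # Dec 1 falls in the prior calendar year

        def days_to(frac):
            # First date on which cumulative salvage reaches the given fraction of the annual total
            first_date = grp.loc[cum >= frac * total, "date"].iloc[0]
            return (first_date - dec1).days

        rows.append({"water_year": wy,
                     "annual_salvage": total,
                     "days_to_25pct": days_to(0.25),
                     "days_to_50pct": days_to(0.50)})
    return pd.DataFrame(rows)
```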

Dr. Grimaldo said they all used the same Dayflow data, turbidity data, salvage data, and any data they could find – including predators, because one of the conceptual models is that predators could affect the number of fish that show up at the pumps if there is high prescreen loss.

He then gave a summary of the results, starting with the GLM negative binomial model. “For our negative binomial runs, we found that for the response at the 50th percentile of salvage, exports was a better predictor than OMR; in both these cases, it was exports and OMR that contributed the most to explaining the variance,” he said, presenting a plot of entrainment risk and noting that darker shading means higher salvage. “Under exports above the 25th percentile and turbidity between about 15 and 60, that’s your big zone of risk. If you see those kinds of conditions, that’s when you’re likely to generate high salvage. So if you’re a manager and you wanted to watch conditions that are going to predict first flush, this might be a good one.”
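As a hedged illustration of the kind of negative binomial GLM described, the following Python sketch fits salvage counts against exports, OMR, and turbidity using statsmodels; the formula, column names, and dispersion setting are assumptions, not the CAMT team's actual model.

```python
# Sketch of a negative binomial GLM relating daily salvage counts to exports,
# OMR flow, and turbidity, along the lines described in the talk. Column names
# and the dispersion parameter (alpha) are illustrative assumptions.
import statsmodels.api as sm
import statsmodels.formula.api as smf

def fit_salvage_nb_glm(df, alpha=1.0):
    """df columns assumed: salvage (daily count), exports_cfs, omr_cfs, turbidity_ntu."""
    model = smf.glm(
        "salvage ~ exports_cfs + omr_cfs + turbidity_ntu",
        data=df,
        family=sm.families.NegativeBinomial(alpha=alpha),
    )
    return model.fit()

# The fitted result's summary() lists the coefficients; exponentiating them gives
# the multiplicative change in expected salvage per unit change in each predictor.
```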

He noted that since there are very few fish around, they probably won’t be observed as much. “We might not see the smelt, but at least we know there’s the potential for the smelt to be there, and that’s the whole point of doing the modeling this way,” he said. “At the annual level, OMR flow was still the best predictor, and turbidity around 18 – 60 is the high salvage risk.”

Dr. Grimaldo then presented the results from the boosted regression trees. “It’s a new method people are using nowadays to describe associations between a response and lots of different variables,” he said. He explained that with this model, you can put in all the different variables and develop a set of predictors; Bill Smith put in about 50 variables and reduced them to four principal components: outflow, exports, OMR, and suspended sediment.

“This principal component of exports, OMR, and suspended sediment concentration is the number one factor that explains salvage at these different time scales,” he said. “That’s not a surprise: you export more water, you’re likely to catch more fish, but you have to have some turbidity there too. So you can export water, and if there’s no turbidity, you have a lower risk. This is another way of confirming what we show with our negative binomial models.”
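For readers curious what that reduce-then-boost workflow might look like in code, here is a minimal scikit-learn sketch; the variable set, number of components, and tuning values are assumptions rather than the analysis Bill Smith actually ran.

```python
# Sketch of the reduce-then-boost workflow described above: compress many
# correlated candidate predictors with PCA, then fit a boosted regression tree
# on the leading components. Settings and inputs are illustrative assumptions,
# not the actual analysis.
from sklearn.decomposition import PCA
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def fit_brt_on_components(X, y, n_components=4):
    """X: matrix of candidate predictors (outflow, exports, OMR, suspended
    sediment, and other covariates); y: a salvage response such as log counts."""
    model = make_pipeline(
        StandardScaler(),                 # put predictors on a common scale before PCA
        PCA(n_components=n_components),   # reduce ~50 predictors to a few components
        GradientBoostingRegressor(n_estimators=500, learning_rate=0.01, max_depth=3),
    )
    model.fit(X, y)
    return model
```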

The population size from the Fall Midwater Trawl was the second most important predictor for two of the models, he said. He also noted there is a possible relationship between predators in the forebay and salvage. “I don’t know what that means yet, but the take-home was, some level of water exports, whether it’s the exports themselves or indexed by OMR, and suspended sediment, matters at those levels.”

Matt Nobriga (USFWS) looked at the different facilities independently and found similar results, Dr. Grimaldo said. “Here he had a December outflow index, which is a proxy for where the fish may be, plus exports and NTU at the CVP, and he was able to explain 78% of the variation there. He did pretty well on the state project, too. So there are small differences between the facilities, but the same story.”

“So going back to the conceptual model review, what I would add is it’s not just OMR and San Joaquin River flow; I think we need to add exports and outflow,” he said.  “I would say the suspended sediment is an important physical response to the freshet, and maybe predation.  These are things that we would add to the conceptual model to keep working on.  There are some questions in the conceptual model that can’t be answered by these sensor data alone.”


