Why Do Professional Learning Providers Fail the PLPG Review Process?


Claire: Welcome to the PL Reality Check, brought to you by Rivet Education. Today, we’re tackling a subject that I know a lot of people are curious about, specifically, why do providers fail the professional learning partner guide review process? Maybe they’ve noticed that their provider isn’t on the list, or they used to be, but now they aren’t. And they’re just super curious to find out the details. Today, I’m joined by our Professional Learning Partner Guide Review Manager, Sruthi Echarry-Diaz, who is the world’s kindest, sweetest human, but absolutely ruthless when it comes to upholding the standard for PLPG admission. So we’re going to have a frank conversation about why providers fail, what the feedback looks like that providers get, and what it actually means if a provider or your provider is in the PLPG. Hi, Sruthi!

Sruthi: Hi, Claire. Hi, everyone! I am so excited to join the PL Reality Check Podcast, and thank you for that fun intro. I’m here to pull back the curtain on the actual PLPG process a little bit. This cycle in particular, we had a record number of overall applications, and a record number of newly accepted providers into the PLPG. So a huge shout-out to our 11 new providers who have recently joined the family. We also launched our Multilingual Learners Badge, and many providers applied for it, along with the Science of Reading and Students with Disabilities badges. However, not everyone makes it into the guide, and I hope to shed a bit of light on why that can happen.

Claire: Thanks, Sruthi. Okay, so let’s start with the PLPG itself. Maybe I should back up and explain: the Professional Learning Partner Guide, if you’re not familiar, is a free tool from Rivet Education. You can filter by different criteria for your professional learning provider, and it gives you a vetted list of providers. I think most people who are familiar with the PLPG use it in that way, or know it for that purpose: they use the filters, they get a smaller list, and they know they can trust that list to help narrow down their decision-making. But it actually has a dual purpose, or a broader purpose. Can you speak to that a little bit?

Sruthi: Yeah, for sure. So, for decision makers, our PLPG, the Professional Learning Partner Guide, is a trusted signal. So if they see providers that are in the guide, that means we, Rivet, have seen strong evidence of their curriculum-based professional learning. And the other purpose is for raising the bar and strengthening CBPL practice in the education field at large through actionable feedback. So we’ve been really intentional at Rivet about how we’ve evolved the PLPG rubrics over time to make sure they’re grounded in both the CBPL framework and what the research actually says about effective professional learning. And we don’t do that work in isolation, even though we’re a small but mighty team, we love partnering with orgs that have expertise in specific fields, such as research, and in this case, we’ve been partnering with the National Implementation Research Network, also called NIRN, on a multi-year validity study to really pressure test what we’re measuring with our rubric and how we’re measuring it. And at the end of the day, the question we keep coming back to is, are we actually capturing the kinds of professional learning that lead to real changes in teacher practice and student outcomes, and are we making that clear enough for decision makers to use? So, it’s not just about making the rubric more rigorous, it’s also about making it more meaningful and more useful for the field.

Claire: Yeah, I know that’s been a big priority for you since you’ve come on board, which is to really use the rubric and the criteria to help providers improve, and you’ve been really intentional about how you’re delivering feedback, in a way that’s useful and actionable for them. It’s not just like, you passed, or you failed. It’s much more detail than that.

So, let’s segue then to probably what a lot of people are curious about: What are the most common reasons that you see providers not passing either the review process, if they’re a new provider, or the renewal process, if they are a returning provider? Maybe you should speak, actually, a little bit first to, like, the renewal process. Why do providers have to renew?

Sruthi: Yeah, I can dive into that. So, once a provider gets accepted into the PLPG, they are up for renewal every 3 years after that point. The reason we do renewal applications is that we want to make sure providers are maintaining a high level of quality in their professional learning throughout the years, and also to hold them to even higher measures of equity and quality in curriculum-based professional learning. So it’s a chance for us not only to ensure that our providers are on track with providing high-quality professional learning, but that they’re continuously improving based on new research and findings in the field. That’s the framework for renewal, and it’s slightly different from the brand-new applicant process in that it is more abbreviated, and it only occurs every 3 years.

Claire: But we still see some providers not passing that renewal process. So, I think we’re probably all curious. Why are providers not passing either the renewal or the new application process?

Sruthi: Yeah, there are definitely some patterns we see come up again and again, and something that I’ve made a priority in my role since I’ve joined Rivet is to analyze those trends and really get to the heart of why providers are not passing, and also give them feedback that enables them to improve and hopefully pass in the near future.

One of the biggest trends we see is a lack of clear evidence tying the professional learning directly to the HQIM. So, as we’ve mentioned before, I know we’ve said a few acronyms already, but one of them was CBPL, which is curriculum-based professional learning. And that’s really at the heart of what Rivet is trying to do in furthering the field, is to tie professional learning directly to an HQIM versus it being curriculum agnostic. Through research, experience, and many other factors, we found that teachers and leaders really benefit when the professional learning is rooted and grounded in the HQIM that that school or that district is actually using. So, providers sometimes describe what they do in their executive summary, which is a part of their submission, but it’s not always clear how that work is actually grounded in the instructional materials themselves, and that is evaluated through the artifacts that they submit during the process.

Related to that, we often see artifacts that are either missing or sometimes pretty surface level. So instead of concrete proof, like detailed agendas, tools, protocols, or examples of actual student work, we get more high-level descriptions. And while descriptions give us an initial understanding of what the PL is trying to achieve, they also make it hard for reviewers to verify whether the PL experience actually looks like that in practice without the actual protocols, examples of student work, or rehearsal materials.

Another common gap is around missing context. Our reviewers can’t evaluate what they can’t see. Facilitator notes are a huge part of understanding the depth and the breadth of a PL session, and our reviewers have time and again pointed out how useful these notes are in understanding the intention behind the practice. So when we receive slides from a provider of a PL session without the facilitator notes to provide the back-end context, we can only take what we see at face value, whereas with the facilitator notes, we’re able to really dive into the pace of a session, what questions are asked, what periods of reflection time are given to the audience, and so on.

And then with specific badges, of which we have three, the Multilingual Learners Badge, the Science of Reading Badge, and the Students with Disabilities Badge, we often see that providers have really strong intentions. They truly care about supporting diverse student populations, but there’s still an opportunity to more explicitly build those supports into the professional learning itself. This is something we see very commonly across the field. It’s not just naming those priorities, but actually showing how the PL is intentionally designed to support learning for multilingual learners or students with disabilities in concrete and actionable ways. So those are just some of the trends that we’ve seen come up over the cycles, all of which can be improved, and all of which we address specifically in the feedback we give, in hopes that providers implement those needed changes.

Claire: Yeah, I’m really glad you brought in the reviewers and getting evidence for the reviewers. So, all of these applications are reviewed by a team of reviewers, right? And then they come to consensus. But like you say, even though these reviewers often have experience with the curriculum, and they’re all experienced educators, they can’t evaluate what they don’t see. So they can’t make assumptions like, “when I experienced PL, it looked like this, so they would probably do that.” They’re looking specifically at the artifacts and the evidence, and they’re performing the role that we would want decision makers to play, which is really probing the claims that providers make. That’s a big value of the PLPG: we’re asking these questions ahead of time to make sure there is evidence to back up the claims that providers are making, and really pushing for that.

Sruthi: Yeah, exactly. Our reviewers are at the heart of our work, because they do so much of the evaluation process, and we rely on them for their expertise. Like you said, many of them have prior experience designing professional learning, a lot of them have done research on this topic, and they’ve all been in the education space for many years. And I want to dive a little bit into the actual review process, because I think that’s such an important component of what Rivet does and what we do well, and is something that we’re proud to showcase. And this is actually something that anyone can read about and see videos about and examples of on our process page on the Rivet website. So I highly encourage you to check that out if you haven’t seen it already.

Behind the scenes, once reviewers submit their individual scores and come to consensus, we as the PLPG team look across all of that evidence, the reviewer notes, and ratings to identify patterns, where there’s strong alignment, where there might be discrepancies, and what the data is really telling us about the provider’s offerings. And from there, we synthesize that into structured, actionable feedback. So the goal is not just to tell a provider that you didn’t meet the bar this time, but it’s to clearly explain why, ground it in the rubric, and point to what would strengthen a future submission.

For every application that comes in, our reviewers hold consensus conversations. These are really thoughtful discussions where the team of three reviewers assigned to each application comes together, looks closely at the evidence, specifically the items they weren’t aligned on in their individual ratings, and then talks through and comes to consensus on the scores and rationale for the final ratings.

As I was speaking to before, we actually have an example of what that looks like on the Our Process page on our site, and we wanted to share this example and put it out there because we want that part of the process to feel transparent as well. It’s a really rigorous process, so I feel like it makes sense that not everyone is going to pass the first time when there is such a high bar for getting into the Professional Learning Partner Guide.

Claire: Another part of your role is giving the tough email or tough call that says, “well, I’m sorry to inform you…” So how do those calls or emails typically go, and how do providers react when they get the news that they have not passed this time?

Sruthi: Yeah, I’m glad you asked that question. Those are never the easiest conversations, although some providers make it very easy because they’re so open to feedback right from the start. But I actually think they’re some of the most important parts of our entire application and review process. Once I’ve synthesized the reviewer feedback and made sure that it’s targeted and actionable, I send out an initial email to providers letting them know their application status.

And my philosophical approach to this is to lead with understanding, because we know the care, the effort, and the time that providers put into submitting applications over a span of six months. And we have providers who are national teams with an abundance of resources and personnel, and we also have providers that are one-woman businesses, and everyone in between. So I really want to acknowledge and respect the energy that they put into being a part of our application process.

And then I clearly let them know what the opportunities for growth are and what areas they are already doing very well in, so that they have the entire range of where their services are. And when I compile this feedback, I lean heavily on reviewer expertise, and I also add in what would be natural next steps for them to revise and resubmit their application. And at the end of my initial email, I always leave room for providers to schedule time with me. I’m happy to connect with them over Zoom or a phone call and include any additional team members that they feel would benefit from learning about next steps. And then we really dive into more specifics of the feedback in those calls. Providers sometimes have additional questions, or they’re very curious about what the immediate next step is in the reapplication process. So I really appreciate the opportunity to connect one-on-one and see the human behind all of the applications that are submitted.

And with how providers react, it can really vary, but they are great at asking follow-up questions, connecting with me, and actually implementing those changes to their professional learning. We’ve had some of our wonderful existing providers tell us that even though they didn’t pass for a new badge this time around or fell short on renewals, the feedback pointed out areas of growth in their own PL, and they want to use Rivet’s rubric as a measure of quality, and that’s always very encouraging to hear. And we truly see that payoff. It’s very common for providers to come back in the next cycle or a near-future cycle and pass. So the feedback that we give them isn’t the end of the road for their PLPG journey. It’s just part of the longer-term improvement process in their professional learning services.

Claire: So, if someone is listening right now, they may be asking the question, “What about my provider? What does it actually mean if they’re not in the PLPG?”

Sruthi: Yeah, that’s a really great question and an important one to address. Providers are up for renewal every three years after joining the PLPG, as we discussed earlier. But not being listed anymore doesn’t mean that a provider is ineffective. What it does mean is that we don’t have sufficient evidence to verify that their professional learning meets the bar for high-quality curriculum-based professional learning aligned to an HQIM, as defined through Rivet’s rubrics. So we hold our renewing partners to a high standard to both ensure that their PL is consistent over time and to evaluate that certain metrics for equity and quality are continuously growing. When a provider is listed, you can feel confident that we reviewed their artifacts very closely and have seen clear, consistent evidence of that quality. If they’re not listed, it just means we can’t vouch for that yet based on what’s been submitted.

Claire: Yeah. And I think that makes a lot of sense. Well, I think those are all the questions I had for today, so we’ll call it there. Thank you so much for joining me today, Sruthi. It’s been a great conversation.

Sruthi: Thank you, Claire. It was my pleasure.

Claire: Alright, until next time!

Sruthi Echarry Diaz


Professional Learning Partner Guide Review Manager

Sruthi serves as the Review Manager for Rivet Education’s Professional Learning Partner Guide (PLPG). In this role, she helps guide the review process and continuous growth of the PLPG by coordinating with expert reviewers, ensuring high-quality standards, and supporting the development of trusted resources for educators. Sruthi brings a people-centered approach to her work, shaped by years of experience as a dual-language educator and manager of student programs and corporate partnerships in Silicon Valley. Whether building relationships with families, streamlining systems, or championing underrepresented voices, she is passionate about creating inclusive, mission-driven spaces where both people and programs thrive.
