Assumptions That Support Decisions and Processes
As we have evaluated grant-funded education programs, helped schools write School Improvement Plans, and helped educators move to data-driven decisions, we’ve seen many practices that don’t result in good academic outcomes. We’ve mapped out the beliefs and skill sets that explain why these practices seemed to make sense to the people who designed them. We think that reviewing what we have seen may help educators learn what they could be doing to get much better outcomes.
Beliefs and Lack of Skills for Making Sense
One of the programs we evaluated was designed to reduce the dropout rate by serving randomly selected girls with minority-sounding names, getting them makeovers and glamour shots in hopes this would raise the likelihood that they passed algebra. Another program targeted low-income students, identifying them by looking at the mothers’ pocketbooks and the buses the students rode home, and then secretly mentoring them. In another program, a school provided remedial reading services to all students who received free lunch, even though the majority of them already read above grade level.
Understanding Beliefs and Skills Sets
We’ve mapped out the most common underlying beliefs and skill sets that seem to account for how some of the programs we evaluate might have made sense to those who designed them. For each belief and skill set we identified, we identified categories that describe where a person or program falls within that domain. Not all domains apply to all situations. For example, one school we worked with had created a data wall. They displayed reading scores from a standardized test along a scale. All of the students with the lowest scores had increased their reading scale scores between 3rd and 4th grade. They celebrated with pizza and were proud to show us. They did not know that the scales for that test differed by grade level: the lowest possible score in 4th grade was higher than the lowest possible score in 3rd grade. Many students had fallen even further behind, but because the staff didn’t know how to interpret the data, they were celebrating. Only one, possibly two, skill sets apply here. They did not have the skills to interpret data. They also may not have had the knowledge or skills for working with data on a computer. Anyone who can use a computer for handling data would never tape paper to the wall instead of using a spreadsheet.
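The data-wall mistake can be made concrete with a small sketch. The scale ranges below are invented for illustration; real ranges vary by test and vendor, and the right fix is to use the test vendor’s growth norms.

```python
# Hypothetical scale-score ranges for a standardized reading test.
# These numbers are invented for illustration; real ranges vary by test.
GRADE_RANGES = {3: (400, 480), 4: (430, 510)}

def relative_position(score, grade):
    """Where a score falls within its own grade's scale, from 0.0 to 1.0."""
    lo, hi = GRADE_RANGES[grade]
    return (score - lo) / (hi - lo)

# A student moves from 420 in 3rd grade to 435 in 4th grade.
# The raw scale score rose 15 points, which looks like growth...
third = relative_position(420, 3)   # 0.25
fourth = relative_position(435, 4)  # 0.0625
# ...but relative to each grade's own scale, the student fell further behind.
```

Raw scale scores can only be compared across grades when the scales are the same; when they differ, each score has to be interpreted against its own grade’s scale.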
Studying these categories may teach you something you didn’t know that you didn’t know. Try your hand at reading these real examples from evaluations we have done. See if you can spot the underlying beliefs and skill sets that made things make sense to the people running the programs.
Belief 1: Cause and Effect
Cause and Effect: Selecting services likely to lead to intended outcomes
- Decides what services are appropriate based on best-practice research.
- Thinks innovative ideas are creative, and since such ideas have been funded in the past, it’s good to think outside the box.
- Takes advantage of the huge amounts of money available to provide some random service, with no accountability for the kids served or for the outcomes. Profits from poverty because they can.
Belief 2: Expert vs. Evidence
Expert vs. Evidence: Does research and data trump the opinion of an expert?
- Statistical analyses of relevant data should inform practice. Outcomes from last year should influence what we decide.
- Academic data does not tell us enough about the kids to make decisions based on it. We need to use our professional judgement about the curriculum and what will be best for the whole child.
- Some of the curriculum choices were decided by important staff members, and they’ll be very upset and may view anyone who votes against their chosen curriculum as disloyal.
Belief 3: What At-risk Means
What At-risk Means: How to decide who needs services
- We should find the at-risk students who are not proficient at grade level by looking at academic data. Any student below grade level is academically at risk regardless of race or income.
- We should find the students who belong to subgroups with higher percentages of students who are below grade level. (Even if they are proficient, they are likely not to be later.)
- We should have teachers recommend at-risk students who they think will benefit from the program.
Belief 4: Desired Outcomes and Goals
Desired Outcomes and Goals: What you are trying to accomplish
- Believes that having all students master the same academic content is the main goal of an academic course.
- Believes that we should have different academic goals for different kids, based on their potential or their likely career paths.
- Believes that students should be on different learning paths based on how much influence their parents have.
Belief 5: What STEM Is and Why We Need to Fill the STEM Pipeline
What is STEM and Why We Need to Fill the STEM Pipeline
- Knows that mathematical concepts, understanding of technology and how it works, engineering principles, and understanding of scientific contexts are critical for working in STEM fields. STEM fields include programming, cyber security, data analysis, and creating products.
- Believes having STEM skills means being able to use the products that engineers create, like using an iPad, scanning a QR code, or creating a YouTube video. No real understanding of what STEM careers are.
- Believe STEM means being creative, like crafting, only with things that fall under the umbrella of school science projects—like creating a windmill from paper cups, etc. No real understanding of what STEM careers are.
Skill 1: Knowing What Can Be Known
Knowing What Can Be Known: Deciding how to do something or how to identify something
- Knows This Can Be Known and How to Know It
- Doesn’t Know What Can Be Known but Knows to Ask Someone if This Can Be Known
- Thinks This Cannot Be Known
Skill 2: How to Identify Kids to Align Services
How to Identify Kids to Align Services
- Knows how to use a computer to produce a list of all kids who meet specific criteria, or asks someone who can do this.
- Thinks we need to look at information on one kid at a time to produce a list of all kids who meet specific criteria.
- Thinks we need to look at information on one kid at a time to produce a list of all kids who meet specific criteria, but that we can save time by using proxies, such as free-lunch status or enrollment in some special service, so we don’t have to check them all.
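The first skill level in the list above can be sketched in a few lines. The records and field names here are invented; a real version would query the district’s student information system.

```python
# Made-up roster; the field names are hypothetical.
students = [
    {"id": 101, "math_grade_level": "below", "passed_algebra": False},
    {"id": 102, "math_grade_level": "above", "passed_algebra": True},
    {"id": 103, "math_grade_level": "below", "passed_algebra": False},
    {"id": 104, "math_grade_level": "above", "passed_algebra": False},
]

# One pass over the whole data set produces the roster directly --
# no checking kids one at a time, and no proxies like free-lunch status.
needs_algebra_help = [
    s["id"] for s in students
    if s["math_grade_level"] == "below" and not s["passed_algebra"]
]
# needs_algebra_help == [101, 103]
```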
Skill 3: How to Classify Things
How to Classify Things: Organizing information so it can be used in analyses
- Sets up a spreadsheet in which each column is a non-overlapping category, with all the variables that will be needed (ID, grade level, etc.). Asks for help if needed.
- Creates record-keeping methods with overlapping or ambiguous categories.
- Believes each adult should keep their own records their own way, and there is no reason to have a standard way.
- Believes there is no real reason to keep track of which students were served in which programs. That would take time from serving the kids. Maybe survey later to get data about program effectiveness.
Skill 4: Skill Set Required for Working With Data
Having the Skills for Working With Data
- Has the high-level skill and expertise to analyze education data, on a computer, or knows to get help from someone who does.
- Believes that analyzing education data is a low-level skill, and just about anyone can figure out how to do it using a computer.
- Believes a data wall, made of paper posted across a wall, is a great way to get information from education data.
Skill 5: Understanding Data Details
Understanding Data Details
- Knows the relevant details for interpreting data, such as that scales differ for every grade on standardized tests, course codes change, graduation requirements differ across school districts, etc., or knows to ask someone.
- Does not know what is relevant to interpreting data, and misinterprets data.
- Thinks data is data, and any data is as good as any other. There is nothing to know about it. Just get some because we are supposed to use it.
Skill 6: Understanding Federal Data-Handling Laws
- Understands that student data is governed by FERPA, a concrete, federally regulated law. Gets advice before sharing any data.
- Believes that those in high places in the local schools can create the requirements for data use and sharing, and they may or may not follow FERPA laws. Believes in local control.
- Has never heard of FERPA. Sends any data requested to anyone, with no concern.
If any of your programs or practices are like the examples below, we can advise you on how to get better outcomes, often far more efficiently. We can also provide tools to help you use data effectively.
Elementary School Improvement Plan
We were helping a Title 1 elementary school write its School Improvement Plan. We started by using their school data to make a Data Profile for them, so they could visualize their school in terms of data. We inventoried the programs and services they already offered, and helped them clearly identify the data-defined target group for each. We wanted to determine whether services were aligned to student needs. They had no overarching records of which students got which services. All the adults kept their own records their own way. The person running each program or service knew who they served, but many of them couldn’t produce a list; they had files on each student.
We helped them put together a master data set of all services and which students got them. They tried to do this themselves at first, but soon learned they didn’t have any staff with the proper skill set, so we did it for them. We created an electronic file that contained all programs, information about the target group for each, which students were in those programs, and their relevant data. From this data set, we could see that many students who were scoring at or above grade level were being pulled from core courses to receive remedial services that they did not need. Some students were receiving the same service twice, from two different staff members. Many students who were in the target group for services were not getting them.
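A sketch of the kinds of checks a master data set makes possible. All IDs, services, and reading levels here are invented; the real file covered every program and service in the school.

```python
from collections import Counter

# Invented example data: reading level per student, and a roster per service.
reading_level = {101: "above", 102: "below", 103: "below", 104: "above"}
service_rosters = {
    "remedial_reading": {101, 102, 104},
    "reading_pullout": {102},
}

# Students receiving a remedial service they don't need:
misaligned = sorted(
    sid
    for roster in service_rosters.values()
    for sid in roster
    if reading_level[sid] != "below"
)

# Students served twice by overlapping services:
counts = Counter(sid for roster in service_rosters.values() for sid in roster)
double_served = sorted(sid for sid, n in counts.items() if n > 1)

# Students in the target group who get no service at all:
served = set().union(*service_rosters.values())
unserved = sorted(
    sid for sid, lvl in reading_level.items()
    if lvl == "below" and sid not in served
)
```

None of these questions can be answered when every adult keeps separate records; all three fall out of one standardized file.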
They decided that before starting any new programs or initiatives, they would use this information to get the students aligned with the proper services, and stop providing services to students who did not need them. They would also start using our record-keeping system so that they would have a big picture of who was getting what services that they could compare with the data.
Beliefs and Skills of the Staff
Knowing What At-Risk Means: Many of the students who were getting remedial services were getting them because they were poor. When the principal saw the academic data in terms of who was getting remedial services, she stopped this. The Title 1 staff at the school argued at first that the poor kids were supposed to get these services, but the principal said the top scoring poor kids would no longer be getting remedial services.
How to Identify Kids to Align Services: They did not know how to keep records or run a list of kids who fit a data profile that aligned with a service. They had Edstar do that for them.
They started no new programs, services, or initiatives. They got the services aligned, stopped providing remedial services to kids who did not need them, and provided them to those who did. They kept standard records so they could easily see who got what. It took them until mid-year to get everything straight. At the end of the year, the percentage of kids scoring at grade level or above increased by 22%, and the achievement gap narrowed by 11%. They now had the data required to tell which of their programs were effective. Edstar helped them look at the results, and they then wrote their School Improvement Plan to do more of what was working for them, and less of what didn’t move the needle. The principal became data savvy, and so did some of her staff.
Mentoring Low-Income Kids
A school hired us to help them with their School Improvement Plan. They had provided Poverty Framework professional development for all of the staff, in which they learned that low-income kids are not as successful in school as non-low-income kids for several reasons, including their spiritual beliefs, the types of relationships they have, and their speech register, among other things. Each staff member identified a poor student and then mentored them to become more like non-poor students in these areas. The school surveyed the staff to have them document how they were mentoring the students, and what progress they were making in these areas. They weren’t sure what to do with the data from the surveys, or how to measure success, and we came in in the middle of this project to help them.
We took the roster of students they were serving and matched them to their academic data. Nearly all of the students had already been successful before this project. The school had very few minorities in it, but about 3/4 of the students being mentored were minorities. The staff reported to us that because they did not have the lunch-status list, they “guessed” who was poor, using things like who rode the bus with the most Black students on it, speech register, and clothes.
Beliefs and Skills of the Staff
Cause and Effect: They thought that what they learned in the poverty training explained why poor kids can’t learn, so they wanted to change those things.
Knowing What Can Be Known: They didn’t seem to know that they could find out which students were not successful in school by using a computer to run a list of kids who were below grade level.
Identifying Kids to Align Services: They thought that they could use information like what bus the kids ride, or their clothes to determine who needed their services, which they believed were for poor kids.
Skills for Working With Data: Although we did not use their surveys about how they were changing the students, doing so would have been very difficult because they were on paper, open-ended when they could have been forced choice, and generally not usable data.
The kids served were already successful in school before service. Some of them were recommended for remedial services and tracked lower in math because the staff believed they were at risk of failure even though they were successful.
The second year that we worked with this school, they learned to use academic data to align academic support services for students who were not successful. They also learned that successful minority students benefited from placement in rigorous and enriched courses.
Dropout Prevention Programs
A school system hired Edstar to evaluate all of the dropout prevention programs in the district. They wanted to move toward data-driven decisions rather than professional judgement. This was a medium-to-large district with many high schools. We got data on everyone who was served by their dropout programs over the previous few years. We had them articulate for us how students were identified for these programs, and what the objectives of the programs were.
Students were referred to the programs by teachers and school counselors for being “at-risk” but “having potential.” However, the district told us that all of the students were below grade level on standardized tests, had poor reading skills, and had behavior problems, either suspensions or attendance issues. They said their objectives were that these students would pass algebra and English I, both of which were required.
Beliefs and Skills of the Staff
Knowing What At-Risk Means: Although they had specific data profiles in mind for who the programs should serve, instead of getting a list of kids who had that profile, they asked for referrals of at-risk kids. Staff who referred kids thought low-income and minority kids were at-risk. And the brightest ones “have potential,” which is why they referred so many academically successful kids.
How to Identify Kids to Align Services: They did not know how to use a computer to get a roster of kids who needed help passing algebra or English I. Many of the kids being served had already passed those classes.
We got the data and found that more than 80% of the students they had served in their dropout prevention programs had always scored at or above grade level in both math and reading, had never been suspended, and had no attendance problems.
We compared the students who did not fit the profile for the program to a matched comparison group. The kids who should have never been in the program, but were, dropped out at a significantly higher rate than their control group. About a fourth of these kids dropped out. The kids who fit the profile of who the program was designed for dropped out at a significantly lower rate than their control group. Looking at the data as a whole, the programs looked ineffective. But, they were effective when they served the right kids.
We helped them use data to identify the kids in their intended target groups, and showed them how asking for referrals gave mostly the wrong kids. They started serving the intended students and their dropout rate significantly decreased. Their achievement gap also closed, probably because they quit putting their brightest minority students into dropout prevention instead of the most rigorous courses.
Can We Know This?
A grant-writer for a school system told me she was supposed to report what the demographic makeup of a school would be if the magnet school recruited and enrolled specific numbers of minority students. She told me that she thought the kids would have to be there before she could compute the ratios. She was telling someone else that this cannot be known, yet she was supposed to give the number in a proposal she was writing. The person she was talking to told her that there may be a way to know what the demographic makeup ratios would be given specific changes in the number of minority students, and that I would not only know if there was a way to compute this, I might even be able to do it. Sure enough, I not only knew this could be known, but I was able to put formulas in a spreadsheet so they could play around with different numbers and get the ratios.
Beliefs and Skills of the Grant Writer
Knowing What Can Be Known: She didn’t know that it was possible to compute the demographic ratios hypothetically, but she thought I might know if it was possible.
I made a spreadsheet for her with formulas to let her play around with different numbers moving in, that computed the resulting ratios for her. She wrote the grant, and got the multi-million dollar magnet grant for the school system.
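The underlying computation is simple once you see that projected ratios don’t require the students to be enrolled yet. Here is a sketch with made-up enrollment numbers (the real spreadsheet used the school’s actual counts):

```python
# Invented current enrollment by group.
current = {"White": 300, "Black": 60, "Hispanic": 40}

def projected_makeup(current, recruited):
    """Percent demographic makeup after recruiting additional students per group."""
    totals = {g: current.get(g, 0) + recruited.get(g, 0)
              for g in set(current) | set(recruited)}
    n = sum(totals.values())
    return {g: round(100 * c / n, 1) for g, c in totals.items()}

# "What if the magnet recruits 50 more Black and 50 more Hispanic students?"
makeup = projected_makeup(current, {"Black": 50, "Hispanic": 50})
# makeup == {"White": 60.0, "Black": 22.0, "Hispanic": 18.0}
```

Changing the recruitment numbers and recomputing is exactly the “play around with different numbers” exercise the spreadsheet formulas supported.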
Glamour Shots to Pass Algebra
A grant-funded program’s purpose was to help students who were not proficient in math pass a required algebra course so they could graduate on time.
The program enrolled students whose last names sounded as if they were minorities, because minorities are known to be at risk of not passing math courses. The program director had to set up the program and needed to hire site coordinators. He didn’t know how many students would need to be in the program, because the list of names was so long and he could not be sure all minority-sounding names actually belonged to minorities. For example, he thought the name Williams might be either Black or White. So, he didn’t think it was possible to know. He selected the kids whose names he was sure sounded minority. He didn’t check whether the kids he selected at random had low math scores, or whether they had possibly already passed algebra. He was confident in the research that says minority students are at risk of failing math; even if they were currently passing, because they were at risk, they could start failing at any time. He used grant funds to have the girls in the program get their photos taken at Glamour Shots. He reported to us that the photos would boost their confidence, and confident people pass math more than people with no confidence. (This is real. This was a 21st Century Community Learning Center grant in a high school.)
Beliefs and Skills of the Program Manager
Assumptions about Cause and Effect: He thought creating confidence with glamour shots would raise algebra scores. There is no evidence that these two things are connected.
Evidence vs. Experts: He thought his expert opinion about this innovative service was better than looking at research on programs that had evidence of helping kids stay in school, or helping them pass algebra.
What At-Risk Means: He thought all minority students are at risk of dropping out.
How to Identify Kids to Align Services: He thought he could identify who is likely to drop out by identifying minority students, and that he could identify them by reading a list of last names.
We pulled pre- and post-program academic data for the students who were served. More than half of them had already taken and passed algebra. Of the students who had not already passed algebra, two-thirds had always scored above grade level in math and had not been at risk of failing algebra. The remaining one-third failed algebra. The glamour shots had no effect at all on passing algebra. The program director argued that the glamour shots were effective because nearly everyone in the program passed algebra even though they were minorities. He went on to run another grant-funded program that we evaluated, and he enrolled students the same way in it.
Students in the program were viewed as “at-risk” simply because they were in the program. They became less successful in school.
Create a Data System
A university realized that using data-driven decisions and aligning research-based interventions with student needs was being favored in federal Department of Education grants. They wrote a grant proposal that included Edstar to create a predictive data analytics system (like EVAAS), and described that it would take us about a month and cost about $5,000. They wrote in their grant that the schools they were going to work with would use this system to identify kids who might be successful in advanced classes.
The Beliefs and Skills of the University Staff
Having Skills for Working With Data: They thought that creating a data analytics system that could be used to make academic predictions was very easy, and low-skilled. It wasn’t worth much and wouldn’t take very long.
We knew they wouldn’t get the grant because of how wrong their assumption was. We tried to tell them about EVAAS, and that the schools already have access to it. They said they would rather work with us, and wanted us to get the $5,000 to do this. They thought we could do a better job than SAS. They didn’t get the grant, which was a given.
Supplemental Educational Services
A school system was delivering remedial reading instruction to all students who received free lunch. This was to satisfy an NCLB sanction for not making adequate yearly progress with all subgroups. The goal of the intervention was to bring students to grade-level proficiency, yet all of the curriculum being used was remedial. The majority of the students receiving services received free or reduced-price lunch but read at or above grade level. When we pointed this out, the school system thought we were accusing them of serving kids who were not poor. The outside tutoring companies providing the services had no academic data on the students. Of the four agencies, one had tested the students before tutoring them, but reported to us that although many of the students’ pre-test scores were very high, they only had remedial curriculum, so they used it for everyone.
Beliefs and Skills of the Staff
Knowing What At-Risk Means: This school was serving the students that they were required to serve, due to not making Adequate Yearly Progress with a subgroup for two years. They were required to offer tutoring to all free-lunch kids. They were not alone in thinking that all of these kids needed remedial tutoring. We evaluated about 20 of these programs, and this is what they were all assuming without looking at the academic data of the kids.
We conducted interviews and focus groups with the teachers in the school as part of our evaluation. They reported to us that the students being tutored were below grade level in reading and math; they said they must be, or it wouldn’t make sense to tutor them with remedial curriculum. Because the teachers believed this (even though the majority of the kids were above grade level), they developed low expectations for these students and recommended the low track when the students went on to middle school, even if they were high scoring. This particular school system continued to conduct the program the same way the following year, and quit evaluating it. A few years after our evaluation, they evaluated the program again and asked the staff in a survey if they thought they had the right kids. The staff thought they did, so they considered the case closed.
The head of the evaluation department of this school system explained to Edstar staff that because the school assignment policy was based on the belief that low-income kids were academically at-risk, treating them otherwise would defy school policy. He prohibited using academic data for making service and placement decisions, and vilified Edstar because we helped schools learn to use academic data.
Raising Achievement, Closing Gaps
We have worked with many school systems to help them use data to close their achievement gaps. The most effective thing we have seen is to use data to align services. This includes enrolling top-scoring students into the most rigorous courses. We have seen that low-income and minority students who score higher than the average scores in the most rigorous STEM classes are often in standard or even remedial courses. We have helped many schools identify these students using data. All schools in North Carolina have access to a data system named EVAAS. The EVAAS Academic Preparedness Report is the best way to identify students who are predicted to be successful in the most rigorous courses.
We worked with one middle school where we used EVAAS and identified 100+ overlooked students who should have been in 8th grade algebra. The principal enrolled only the top half of these kids (those with the highest predictions) into algebra. All 50 of the students she put into algebra were successful, and 96% of them continued to take advanced math in high school and were successful. In another school, the principal moved over 100 low-income and minority kids into the top track, based on data. They were all successful. We have seen in many school systems that using data to identify students for enrollment into the most rigorous courses raises the schools’ achievement and closes gaps. In many other schools, we simply showed them how to use EVAAS to identify the students likely to be successful. One school increased the number of students in its most rigorous courses fourfold. In every case, overall school achievement rose and gaps narrowed significantly.
Beliefs and Skills of the Staff
Having the Skills for Working With Data: None of the schools we worked with had the skills to run EVAAS reports to identify kids who were likely to be successful in the most rigorous courses. So, we did this for them. We also taught them how.
Knowing What Can Be Known: They did not know that they could run an EVAAS report, taking 5 minutes, and get a list of names of the kids who were likely to be successful in advanced courses.
Achievement rises and gaps close. This changes the culture of the schools, which begin enrolling low-income and minority students in large numbers into the most rigorous courses. There are a lot of cultural issues to address when this is done. Some schools successfully address those issues and have success. Other schools run into conflict when the most elite parents incorrectly believe the courses must be watered down if low-income and minority students are in them. Stakeholder communication is critical.
Re-Enrolling Long-Term Suspended Students
A large school district hired Edstar to evaluate the effectiveness of a federal grant-funded program that was designed with the goal of having long-term suspended students re-enroll in school after serving their suspension, and be successful. The grant paid for additional school counselors to case-manage the students while they were out of school, connect them to services, including enrolling them into the school system’s alternative schools while suspended so that they could earn credits, and not be so far behind.
The school system contracted with a private firm that ran the alternative high school. Although far more students were suspended than this school’s capacity, the alternative school was never even half full. The school system did not understand why. A major objective was to get the kids to enroll in this school.
Although nearly all suspended kids could enroll in this alternative school, the school counselors reported to Edstar that most of the kids they were case managing got letters from the school system that said “No Offer” next to the name of the alternative school. We confirmed with the office that sent the letters that with few exceptions, kids could enroll in this school. This was a mystery for the first year. The school counselors were very frustrated.
After countless meetings and interviews with the office that sent the letters to the suspended students, we discovered that their data system had a field with the variable for “No Offer” to the alternative school. Because the condition “No Offer” was so rare that they never needed it, they were using this field internally to indicate that the paperwork was not complete. A secretary generated and sent the letters, which was a feature of the data system. Because no one talked to students once they were suspended, the office had not realized that the kids and their families thought they had “No Offer” to enroll in the alternative school. The school counselors thought the same thing when they saw the letters. We got this fixed after the first year of the grant. The alternative school, which had been mostly empty for years, filled up.
Another glitch we uncovered later was that when kids took courses at the alternative school, the courses did not have the course codes used by the school system, because it was a private school. In many cases, students did not get the credits they had earned because the data managers at their base schools did not know how to enter the courses. In some cases, the data managers were deciding how to translate the codes, and they were not all doing it the same way. In other cases, they simply told the kids they couldn’t have the credits. Edstar then worked with the school counselors and the data department to create a standard way to translate the course codes.
Beliefs and Skills of the Staff
Understanding Data Details: The office that worked with the suspension data had operated for years without any feedback after generating the letters. They didn’t understand how re-purposing the No Offer field, without relabeling it, affected the letters that were generated and sent to families.
Without these school counselors acting as advocates, the kids had previously had no recourse when they did not get the credits earned at the alternative schools. No one was aware of this. No one with authority had realized that the schools needed a way to translate the course codes from the alternative school.
The grant funded the additional school counselors for 5 years. By the end of 5 years, the dropout rate for long-term suspended students had significantly decreased, the alternative school was full, and kids were re-enrolling and being successful. We also learned a lot about what kinds of services were most effective for helping the kids.
The head of school counseling and Edstar published a paper about what was effective in a peer-reviewed journal.
The school system made these additional school counselor positions permanent when the grant ended, due to the documented effectiveness of the program.