What did you find out? To what extent were your objectives achieved? Please re-state your objectives.
The program faced limitations in achieving its goals due to factors outside the direct control of program staff. Meeting the screening requirement itself was not the most difficult part: the protocols and screening methods in place at schools, along with appropriate scheduling, allowed program staff to screen all mandated grades at all schools by the end of September. The second important part of the goal, however, was completing examinations at the schools using state-contracted providers. This required the program to obtain permission from families and/or encourage families to apply to the program.
Based on the previous year's experience, the DOE and DOH designed an emergency contact card that included language allowing program staff to apply on behalf of families of children who fail screenings. This card had been in use for some time; however, obtaining the signed consent became problematic.
Although schools require the signed form, many referred cases did not have the forms on file. Of the 388 students initially identified for referral to Florida Heiken, documentation and approval were obtained for only 253 within the timeframe necessary to have them serviced when the provider was scheduled to visit the various schools around the district. Moreover, 62 of these 253 still did not receive services, either because they were not present for the bus or because their school did not receive enough applicants to warrant a mobile unit visit. These students were instead issued vouchers that allow them to access a medical evaluation, but parents must still schedule the appointment. This requires additional program staff time to call the parents, educate and encourage them to seek the evaluation, and then confirm whether the child was examined.
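The referral figures above form a simple funnel. A minimal sketch of the arithmetic follows (the variable names are illustrative only and are not taken from any program system):

```python
# Referral funnel using the counts reported in this section.
identified = 388   # students initially identified for referral to Florida Heiken
consented = 253    # signed consent/documentation obtained within the timeframe
not_served = 62    # consented students who missed the bus or whose school
                   # lacked enough applicants for a mobile unit visit

served_on_site = consented - not_served   # examined on the mobile unit
vouchers_issued = not_served              # redirected to voucher follow-up
missing_consent = identified - consented  # never cleared for on-site service

print(served_on_site)   # 191 students examined on site
print(vouchers_issued)  # 62 students needing voucher follow-up
print(missing_consent)  # 135 referrals without timely consent
```

The 62 voucher cases and 135 missing-consent cases together account for the follow-up workload the section describes.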
Did you evaluate your practice?
Using comparison data from state-mandated reports, and focusing on counties of similar size and population, the program evaluated (based on reported data) how it compared to other programs delivering the same services.
As of a 12/3/2018 report based on data submitted between 7/1/2018 and 11/30/2018, the following information was used to evaluate the program:
Counties with mandated vision screening populations of similar size (12,000-15,000) were compared on the percentage of the population screened and the percentage of referrals with reported outcomes:
Comparison A1 (N=12,742): 21.95% reported screened, 3.08% of referrals with outcomes
Comparison A2 (N=13,194): 67.58% reported screened, outcomes not usable (an apparent data entry error reported 1,150% of referrals with outcomes)
Comparison A3 (N=15,103): 56.99% reported screened, 3.55% of referrals with outcomes
Comparison A4 (N=12,973): 63.66% reported screened, 0.48% of referrals with outcomes
Comparison A5 (N=12,734): 57.89% reported screened, 18.73% of referrals with outcomes
Other comparison data were based on the DOH-Collier program's overall screening responsibility. Separating partner data from the DOH requirement, we focused on the 5,980 students reported screened by the program. This number represents the total screened by the program in one month, and it covers 100% of the schools screened by the DOH program without partner involvement. Similar-sized counties with a mandated requirement of 5,000 to 7,000 students were used for comparison:
Comparison B1 (N=6,702): 74.89% reported screened, 1.82% of referrals with outcomes
Comparison B2 (N=5,336): 88.08% reported screened, 0% of referrals with outcomes
Comparison B3 (N=5,595): 32.75% reported screened, 0% of referrals with outcomes
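The comparison rates above can be reproduced from raw counts. A minimal sketch follows; note that the state report publishes only the mandated N and the resulting percentages, so the screened, referral, and outcome counts in the example call are hypothetical, chosen only to illustrate the calculation:

```python
def screening_rates(mandated, screened, referrals, outcomes):
    """Return (% of mandated population screened, % of referrals with outcomes)."""
    pct_screened = 100.0 * screened / mandated
    # Guard against counties that reported no referrals.
    pct_outcomes = 100.0 * outcomes / referrals if referrals else 0.0
    return round(pct_screened, 2), round(pct_outcomes, 2)

# Hypothetical county: 6,702 mandated, 5,019 screened,
# 550 referrals, 10 referrals with reported outcomes.
print(screening_rates(6702, 5019, 550, 10))  # (74.89, 1.82)
```

A rate over 100%, such as Comparison A2's 1,150% of referrals with outcomes, can only arise when more outcomes than referrals are entered, which is why it is treated as a data reporting error.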
List any primary data sources, who collected the data, and how? (if applicable)
The primary data source is aggregated data provided by the DOH school health central office, based on data reported by each county. Data are reported against set entry deadlines, as follows:
December 31st: 45% of the mandated population screened
April 1st: 95% of the mandated population screened
There is no deadline for reporting outcome data (referral follow-up) other than end-of-year reporting, and no set percentage of referrals is required to have outcome data reported. Best practice, however, is to report outcome data for all referred cases.
List any secondary data sources used. (if applicable)
Internal data sources are used to monitor and measure program efficiency and effectiveness. These data come from the school district's data management system used by the DOH screening program. They are based on data entered by program staff and are subject to time delays depending on when data are entered and when reports are run. The main use of these data is to monitor the number of mandated students screened in school.
The number of students requiring screening in the district is often larger than the parameter set by the state office, because the state's metrics are set at the beginning of the year while enrollment continues throughout the year. As students enter the district, which has a large migrant population, the screening target fluctuates. We use these internal data sets to monitor our own process and remain accountable to our screening population.
List performance measures used. Include process and outcome measures as appropriate.
The most important performance measure was meeting the outlined schedules for vision screenings and data reporting. Any delay or change in the vision screening schedule would have resulted in the program not meeting the 100% screening requirement it set out to achieve.
There were no delays or adverse events affecting the schedule, and 100% of screenings were delivered to the mandated populations by the six-week deadline.
Describe how results were analyzed.
By keeping to the outlined schedule, the program was confident it had screened the majority of the mandated population set by the central office. Knowing that partner organizations achieved the same goal, the program was confident, though unable to confirm immediately, that the target was either met or would far exceed the comparison groups. By December 19th, final internal reports showed that, between the program and community partners, 92% of the total mandated population was screened in the month of September.
Based on this information, the program did not achieve its intended goal of 100% screening, but it did exceed all comparison groups.
In addition, this result exceeded 61 of the 66 counties on the December 3, 2018 report. The remaining five counties completed above 92% of their mandatory screening requirements and had a combined total of 7,160 students mandated for screening; the largest of the five had 2,186 students requiring screening.
Given the size of Collier County's mandatory screening population, the partnerships involved, and the overall outcomes on the metrics used for comparison, the program concluded that it met its intended goal and can be considered one of the quickest vision screening systems in the state.
Were any modifications made to the practice because of the data findings?
No modifications were made at this time. Future changes will include data reporting and obtaining partner organization data for reporting.