PLANNING AND EVALUATING EDUCATIONAL PROGRAMS TO INCREASE ADOPTION OF SAFETY PRACTICES AMONG PESTICIDE APPLICATORS

Susan Whitney, Extension Pesticide Coordinator
University of Delaware Cooperative Extension System
Newark, Delaware

  • Introduction
  • Glossary of terms
  • EDUCATING TO INCREASE ADOPTION OF SAFETY PRACTICES
  • PLANNING AND EVALUATING EDUCATIONAL PROGRAMS
  • Acknowledgements
  • Literature cited
  • Appendix A
  • Appendix B
  • UNIVERSITY OF DELAWARE COOPERATIVE EXTENSION

  • PESTICIDE APPLICATOR SAFETY INTERVIEW
  • Appendix C


    Pesticide educators today are expected to evaluate Pesticide Applicator Training (PAT) programs in terms of how many applicators changed behaviors as a result of training. In 1991, Wintersteen and Poli stated "Documenting success gives interested parties an indication of the value of the PAT program... Conversely, an undocumented program is a fragile thing - virtually defenseless and by all appearances, unworthy of defense." Canerday (1989) said, "...it has become quite clear that no longer is the supporting public going to blindly continue to shovel money into the gasping mouths of every public service agency, without requiring that agency to stand and be judged in the court of accountability." Many professionals are reluctant to evaluate their programs. Ladewig (1994) pointed out reasons for the unpopularity of evaluations: we don't want others to know of our failures, we don't know how to establish value, we don't know how to collect evidence for evaluation and we don't have the time or money for evaluation. This manual attempts to counteract all four of these objections.

    Don't want others to know of our failures
    While it is certain that you will uncover some failures when you do an evaluation, knowing what the failures are will help you correct them. Then you will have successes to report and applicators will have better programs!
    Don't know how to establish value. Don't know how to collect evidence for evaluation.
    This manual takes you step by step through the evaluation process. It starts with designing a training program, describes data collection, explains statistical analysis and ends with applying the results of the evaluation to your next pesticide applicator education event.
    Don't have time or money for evaluation.
    None of us has unlimited time or funds. This manual covers three methods of evaluation that are cost and time effective.
    Before beginning a description of the three methods, it is important to define the terms involved in evaluation. The following glossary has been adapted from Ladewig (1994), Nowak (1994), and Salant and Dillman (1994):

    EVALUATION
    To evaluate is to determine the comparative worth of something. The word comparative is important here. Evaluation does not set an absolute worth. One must always compare the result of an educational program to something. That something may be a goal or the result of training at another time or place. The purpose of evaluation can be to learn how to make a program better or to justify the existence of the program or both.

    VALIDITY
    If the evaluation measures what we set out to measure, then the procedure is valid. If we wanted to know how many applicators wear respirators, our evaluation instrument would not be valid if it measured the number of applicators who knew about wearing respirators or who thought they should wear respirators. Those instruments would measure knowledge or attitude, not behavior.

    RELIABILITY
    If the evaluation measures the same way each time it is performed, then it is reliable. A survey administered by untrained enumerators may not be conducted the same way by each person and, therefore, would not be reliable.

    GOAL
    The target of the educational program is the goal. It may be to change attitudes, knowledge, skills, behavior or impact on the community. This manual is concerned only with measuring behavior changes.

    REPRESENTATIVE SAMPLE
    When a sample is drawn at random, it is representative of the population. If the sample is not representative, then the evaluator must characterize the sample and characterize the population. The evaluator must point out differences and similarities and take these into consideration when drawing conclusions. Using a non-representative sample does not mean that the evaluation has to be thrown away. It just means that the researchers will have extra work to do and they must make conservative generalizations.

    COVERAGE ERROR
    A sampling frame is a list of persons from which a sample is drawn. If the sampling frame does not include all elements of the population that the researchers wish to study, then there will be coverage error. The degree of coverage error is a function of how different the missing, ineligible or duplicate entries are from the target population.

    SAMPLE SIZE AND SAMPLING ERROR
    The number of individuals included in the evaluation must be large enough to yield meaningful results. Unfortunately, researchers sometimes don't find out that the sample is too small until the evaluation is completed and the statistical analysis gives unusable results. An evaluation with a sample size too small will have too much sampling error. This will not allow the researcher to confidently generalize from the sample to the population. Always use a sample size as large as time and funds will allow. And be ready to increase the sample next time if there is too much sampling error.
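    A quick way to see how sample size drives sampling error is to compute the margin of error for a sample proportion. The sketch below is illustrative only; it assumes simple random sampling and uses the worst-case proportion of 0.5, and the function name is ours, not from the manual:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a sample proportion.

    Assumes simple random sampling; p = 0.5 gives the worst case.
    """
    return z * math.sqrt(p * (1 - p) / n)

# Doubling the sample does not halve the error; it takes roughly
# four times the sample to cut the margin of error in half.
for n in (50, 100, 200, 400):
    print(n, round(margin_of_error(n), 3))
```

    Halving the margin of error requires roughly quadrupling the sample, which is why evaluators must balance precision against time and funds.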

    QUESTIONS AND MEASUREMENT ERROR
    Questions on an evaluation have to be asked in an objective manner and in a format that will not encourage the respondent to give the answer he/she thinks you want to hear. Measurement error is the difference between a respondent's answer to a question and the real answer that the respondent should have given. This source of error may come from the way the questions are worded or the belief by the respondent that the evaluator wants a certain answer.

    NON-RESPONSE ERROR
    If a significant number of people do not respond to the evaluation and if these non-respondents are different from those who do respond, there will be non-response error. One can never assume that non-respondents would have answered in the same way as the respondents. This manual gives some standard methods to increase the response rate in a mail survey.

    Salant and Dillman list four characteristics of a perfectly accurate survey, after stating that such a survey is seldom, if ever, conducted:

    1. every member of the population would have an equal (or known) chance of being selected for the sample,
    2. enough people would be sampled randomly to achieve the needed level of precision,
    3. clear, unambiguous questions would be asked so that respondents are both capable of and motivated to answer correctly, and
    4. everyone in the sample would respond to the survey, or non-respondents would be similar to the respondents on characteristics of interest in the study.

    EDUCATING TO INCREASE ADOPTION OF SAFETY PRACTICES

    There are several reasons why people adopt new behaviors:
    1. The new behavior holds some value to them.
    2. The benefits are greater than the cost.
    3. The person has the resources for making a behavior change: knowledge, time, skills, labor, equipment and money.
    4. They see the behavior as attainable.
    5. They get support and reinforcement for making the behavior change from peers, family, society or from themselves.
    6. The old behavior is no longer really an alternative, or it is not as desirable.
    There are also barriers to adopting new behaviors:
    1. The new behavior has no value to them.
    2. The costs are too high.
    3. The person does not have the resources for making a behavior change: knowledge, time, skills, labor, equipment and money.
    4. They don't see the behavior as attainable.
    5. They don't get support or reinforcement for making the behavior change from peers, family, society or from themselves.
    6. The old behavior is still an alternative and probably very easy to fall back on.
    There are five distinct, predictable steps that people go through before they adopt a new behavior (Andreasen, 1995). Depending on personality and type of behavior change, some people may move through these steps very quickly, while others seem to drag their feet. Pesticide educators can take advantage of the five steps to help move applicators along the path to adoption of the new behavior.

    Step I. PRE-CONTEMPLATION

    When you introduce a new practice, applicators may not think that the procedure applies to them. They may feel that the behavior is inappropriate. For example, assume that you want applicators to wear long sleeved shirts during mixing, loading and application. The applicator hears the message, but thinks that wearing long sleeved shirts isn't meant for him/her. That's something that "the other guy" should do.

    At this time you must provide facts, create awareness and interest in the new behavior. You could tell the applicators how much contamination lands on the forearms during handling. You could demonstrate with fluorescent dye and blacklight how much exposure is avoided with long sleeved shirts or forearm protectors. Fact sheets and newsletter articles would reinforce this information.

    Step II. CONTEMPLATION

    At this stage, applicators begin thinking about the consequences of the new behavior -- the positive and negative aspects of it and what others expect them to do. They consider the risk of exposure to pesticides and the protection provided by long sleeved shirts. The applicators are aware that others expect them to protect their forearms.

    At this time you need to de-market the old behavior -- eliminate competition from old behavior patterns. You should discourage the old behavior of wearing tank tops and t-shirts. You also need to show how benefits exceed costs. Explain that the benefits of reducing exposure to pesticides (reduction in illness, doctor bills, and lost pay from sick days) are greater than the costs of wearing long sleeved shirts (heat stress and temporary discomfort).

    Step III. PREPARATION

    Now the applicator is planning strategy to try the new behavior. At this stage he/she needs to view the costs of the new behavior as being low enough to wear long sleeved shirts or forearm protectors. You should stress that the costs of taking breaks and drinking water to prevent heat stress are not that great. Express these costs as a percentage of the cost of the product or service.

    Step IV. ACTION

    At this point the applicator views the new behavior as achievable and attempts it at least once. Now you need to show the applicators that they have (or can get) the knowledge, skills, time, resources and labor to protect their forearms regularly. Provide information on these resources.

    Step V. MAINTENANCE

    Finally, the applicator completes trials in the new behavior and becomes committed to protecting the forearms. The new behavior is integrated into his/her routine. Your job is not done though. You must regularly support and reinforce efforts to maintain the new behavior.

    Knowledge of how people move through these five steps will help you design more effective educational programs. Before running a workshop, do a needs assessment to determine how many in the audience are at each stage. If there are applicators at stage 1, your presentation needs to start there and carry the audience through all five steps. Reward applicators who have adopted and maintained safety practices. But keep in mind that not all applicators will move through the five stages to behavior adoption. As shown in Table 1, educators can predict how many people will adopt new behaviors.
     
    Table 1. Proportion of individuals adopting new behaviors (adapted from Rogers, 1983)

    Category        % of Population  Personality
    Innovators      2-3              Younger, more formally educated, wider information net, more affluent, risk-takers. Tend to under-conform to family, community, and societal standards and norms.
    Early Adopters  12-13            Similar to the innovators, but to a lesser extent. Still somewhat under-conform to standards and norms.
    Early Majority  34               Age varies, less formally educated, more restricted information net, average income, less inclined to take risks. Setting and conforming to the norms and standards of reference groups is more important.
    Late Majority   34               Older, less formal education, less affluent, limited risk-taking exhibited.
    Laggards        16               Over-conform to standards of family, community, and society. See no benefit economically, socially, culturally, or politically if the "idea" is accepted.

    PLANNING AND EVALUATING EDUCATIONAL PROGRAMS

    The first step in measuring adoption of safety practices among pesticide applicators is to choose the method of evaluation. Three methods are presented in this manual. Each has its advantages and disadvantages. Choice of method will be determined by the available resources, the educational activity, and the audience. Each pesticide educator should carefully review these three methods and choose the one best suited to his/her needs. Each method will stand on its own as a statistically valid measurement tool. There is no need to add variables or to try to "measure the universe." Instead the evaluator should clearly express the results of the study within the constraints of the study design. One can only describe what one measures, but one can measure what one sets out to study.

    METHOD I: PESTICIDE USE-OBSERVATION

    PROGRAM PLANNING TO IMPROVE COMPLIANCE WITH A TARGETED PESTICIDE HANDLING ACTIVITY/POLICY IN THE STATE


    INTRODUCTION

    The purpose of this evaluation method, as well as the two that follow, is to identify strengths and weaknesses in Pesticide Applicator Training (PAT) and Worker Protection Standards (WPS) Training as they relate to "real world" situations for the pesticide handler/applicator. This method targets one pesticide practice for improvement through increased educational programming. Success of the educational efforts is measured by compliance with the targeted behavior. Compliance rate is obtained from pesticide use-observations conducted by the state lead agency (SLA).

    PROCEDURE

    I. Target Behaviors
    Staff from the SLA and state cooperative extension system (CES) should discuss patterns of pesticide handling compliance in the state. One pesticide use activity or policy should be chosen as a target. The targeted activity/policy may be the most common compliance problem, the most serious, the activity with the greatest chance of change, or a combination of these factors. Each state should establish its own criteria when deciding which compliance activity/policy to target.

    Gather base line data before you start the study. State inspection check lists and/or photos from pesticide use-observations should be reviewed for the past pesticide use season. Records should be made of numbers of handlers who comply with the targeted activity and numbers who do not comply. While trying to gather the base line data, you may find that the inspector check list is not designed to measure the pesticide practice that you wish to target. At this time you could pick another behavior to target. Make sure it is one that can be measured from the current check list. Or you may decide to revise the inspector check list and schedule this evaluation for a later date.

    II. Plan Educational Events and Set Goals
    SLA and CES staff should plan educational events that will encourage handlers/applicators to adopt the targeted practices. The audience must be certified applicators. Educational events should be chosen that will reach as large an audience as possible. In most states, only a small proportion of certified applicators will be selected for use-observations by the SLA. Education efforts should be designed to ensure that the few individuals selected for observation have in fact received the educational programming. Thus, far-reaching educational programming is essential. Examples are newsletters that are mailed to all certified applicators; direct mailings of fact sheets or brochures to all applicators; required recertification courses; or popular, well-attended grower meetings. Designing a combination of these educational events will increase the chance that those applicators selected for use-observation have been exposed to the educational efforts. Keep in mind that it is not legal to select attendees of a specific training program for use-observation by the state. In some states that would constitute harassment.


    The next step is to set the goal. The organizers should decide how much of an increase in compliance they can expect given their educational efforts, the attitudes of the regulated community, and public sentiment. As a guideline in setting the goal, the organizers should look at the success rate of comparable educational events in the state or other states. If similar projects have not been conducted, then the organizers must look to the breadth and depth of their planned educational events. A short-term event for a few handlers will probably result in a small increase in compliance statewide. This increase may not show up in the season's use-observations. If the event is more extensive, then a larger goal can be set. However, organizers should not plan resource-consuming educational events out of fear that a measurable behavior change won't be obtained. Such negative results can be used as a guide to planning the next year's educational events. As a final guideline, organizers should look at their base line data. A pesticide practice with a low compliance rate should increase with education, while a practice that is already accepted by applicators may not show improvement even with directed education.

    III. Prepare Evaluation Instrument and Validate
    Usually when conducting an evaluation, the researcher must show that the instrument and procedures measure what is intended in a consistently reliable manner. In the context of this study, validation would document that each use-observation is done the same way as all other use-observations in the state. Validation would also show that the inspection instrument and procedure tells the SLA whether the applicator is complying with selected pesticide handling practices. EPA supports annual inspector training in each region. This training provides the consistency needed for EPA to document that use-observations are valid. As long as this evaluation in your state is conducted by state inspectors who have attended EPA supported annual training, no further validation is needed.
    IV. Gather Data
    It is important to show cause and effect between the educational efforts and compliance. This can be measured by asking applicators if they read the newsletter, brochure or fact sheet or if they attended the recertification program or growers meeting. Applicators who were not exposed to the educational efforts must be dropped from the study. Those who were exposed to the educational event may be retained in the study.


    The study should be divided into three equal time periods: before, during and after the educational events. State inspection check lists and/or photos from pesticide use-observations should be reviewed for the period preceding the educational events. Records should be made of numbers of handlers in the study who comply with the targeted activity and numbers who do not comply. The proportion of compliance should be compared to compliance during and after the educational events.

    V. Analyze and Interpret Data
    Comparisons should be made between the goal and the observed rate of compliance. Comparisons can also be made between the base line data and compliance during the educational events and compliance after the educational events.


    With sample sizes of 120 applicators or more, the Z test of pair-wise comparisons of proportions (Moore, 1993) can be used to test the hypotheses that the observed proportion is different from the goal:

        Z = (o/n - g/n) / sqrt[(g/n)(1 - g/n)/n]
    Where:
    Z = the test statistic

    g = the goal -- the number of applicators you hope will comply with the targeted behavior.
    o = the observed number of applicators who comply with the targeted behavior.
    n = the number of applicators inspected by the state during the use-observation who were exposed to the educational effort.
    If the Z value is greater than 1.96, then the difference between the goal and the observed value is statistically significant. If your goal is higher than your observed value, you have not reached your goal. If your goal is lower than your observed value, you have exceeded your goal. If the Z value is less than 1.96, then the difference between the goal and the observed value is not statistically significant. This means that you can show no difference between the goal and the rate of compliance and you can consider your educational efforts to be on target.

    If you are working with sample populations of fewer than 120 applicators, you need to consult a Student's t-distribution table or your statistician.
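    As a sketch, the goal comparison can be computed in a few lines of Python. The counts below are hypothetical, not data from this manual, and the function name is ours:

```python
import math

def z_vs_goal(o, g, n):
    """Z statistic comparing the observed number of compliant
    applicators (o) to the goal (g) among n inspected applicators."""
    p_goal = g / n
    p_obs = o / n
    return (p_obs - p_goal) / math.sqrt(p_goal * (1 - p_goal) / n)

# Hypothetical: 150 applicators inspected, goal of 60 complying,
# 75 observed complying.
z = z_vs_goal(o=75, g=60, n=150)
print(round(z, 2))  # 2.5 -- greater than 1.96, so the observed
                    # rate differs significantly from the goal
```

    Because the observed value exceeds the goal here, the program would be judged to have exceeded its target.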

    The Z test (Salant and Dillman) can also be used to test the hypotheses that the base line value is different from rates of compliance observed later in the study:


        Z = (a/b - x/y) / sqrt[(a/b)(1 - a/b)/b + (x/y)(1 - x/y)/y]

    Where:
    Z = the test statistic

    a = the number of applicators who comply with the targeted behavior during the first time phase.
    b = the number of applicators who were exposed to the educational effort during the first time phase. For calculating base line data, b = the number of applicators inspected during the use-observation.
    x = the number of applicators who comply with the targeted behavior during the next time phase.
    y = the number of applicators who were exposed to the educational effort during the next time phase.
    The examples given in the case study below show how to use both formulas.
    VI. Apply Results to Pesticide Education
    If the evaluation shows an increase in compliance with the targeted activity after educational events, a state may theorize that the educational events contributed to compliance. The results from this evaluation can be used to design more effective educational programs in the state. Program coordinators will learn how extensive a program must be to have an impact. Other compliance activities/ policies in the state can then be targeted in the same manner as the first.

    ADVANTAGES OF THE USE-OBSERVATION TECHNIQUE

    1. Targets one specific activity in the state.
    2. Study sample is randomly chosen and therefore representative of all applicators in the state.
    3. Uses existing data gathering forms -- the SLA state inspector check list.
    4. Uses existing data gathering procedure -- the SLA use-observation.
    5. Data gatherers (state inspectors) are already trained at annual regional meetings.
    6. Validation of procedure already done by EPA.
    7. Fosters cooperation between SLA, CES & other agencies.
    8. No mailings

    DISADVANTAGES OF THE USE-OBSERVATION TECHNIQUE

    1. Educational programing must reach the entire population of pesticide applicators.
    2. Extra question must be added to state inspector check list to establish cause and effect relationship.
    3. An employee must go through inspection records and extract data.
    4. Small sample size of use-observations may result in conservative generalizations.

    CASE STUDY: THE LONG-SLEEVED SHIRT CAMPAIGN

    The activity targeted for increased compliance in this hypothetical case study was the use of long-sleeved shirts during mixing/loading/application by licensed pesticide handlers on farms, forests, nurseries and greenhouses. The educational events were articles in the state pesticide quarterly newsletter, "Pesticide Briefs." The newsletter was published jointly by the SLA and the CES. It was sent free of charge to all licensed applicators in the state. Stories emphasizing the importance of long-sleeved shirts were run for one year. A goal of increasing compliance among handlers by 10% was set. A relatively low goal was chosen because the effectiveness of newsletter articles was unknown, the number of applicators actually reading the newsletter was unknown, and similar studies were not available for comparison. The goal was not set lower because the educational event, newsletter stories, was able to reach all the state's licensed applicators.

    Records were made of the numbers of applicators wearing long-sleeved shirts for the year before the newsletter stories began, the year while the stories were running and one year after the stories appeared. After the study began, state inspectors asked the additional question, "Do you read Pesticide Briefs?" Records were made of numbers of handlers inspected in each time phase and numbers who read the newsletter. 


    Table 2. Numbers of applicators complying with pesticide label by wearing long-sleeved shirts. Hypothetical case study.

    Time phase        Inspected  Read "Pesticide Briefs"  Wear long sleeves  %*
    Before education  121        --                       25                 20.7 a
    During education  128        105                      36                 34.3 b
    After education   125        100                      30                 30.0 a,b
    * Values in a column followed by the same letter are not significantly different at P=0.05 level (Z test).

    These results show that there was a significant difference between the rate of compliance before newsletter stories and during newsletter stories. No firm conclusion can be drawn about the lasting impact of newsletter stories. The Z test shows no significant difference between either "before education" and "after education" or between "during education" and "after education." With results like this, it is wise to draw conservative conclusions. 
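    The significance letters can be checked with a short pairwise Z-test sketch. This uses unpooled standard errors and takes the denominators implied by the % column (121 inspected before education; 105 and 100 newsletter readers during and after); the function name is ours:

```python
import math

def z_two_proportions(s1, n1, s2, n2):
    """Unpooled Z test for the difference between two compliance
    proportions: s1 of n1 vs. s2 of n2."""
    p1, p2 = s1 / n1, s2 / n2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return (p2 - p1) / se

# Counts as in the hypothetical case study:
# before 25/121, during 36/105, after 30/100.
print(f"{z_two_proportions(25, 121, 36, 105):.2f}")  # before vs. during
print(f"{z_two_proportions(25, 121, 30, 100):.2f}")  # before vs. after
print(f"{z_two_proportions(36, 105, 30, 100):.2f}")  # during vs. after
```

    Only the before-versus-during comparison exceeds 1.96 in magnitude, which matches the a/b significance letters.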


    Table 3. Numbers of applicators complying with pesticide label by wearing long-sleeved shirts compared to the goal of a 10% increase. Hypothetical case study. During educational event.

    Time phase        Inspected  Read "Pesticide Briefs"  Wear long sleeves  %*
    During education  128        125                      43                 34.4 a
    Goal              --         --                       38.4               30.7 a
    * Values in a column followed by the same letter are not significantly different at P=0.05 level (Z test).

    No significant difference was found between the goal and rate of compliance during the run of newsletter stories. 


    Table 4. Numbers of applicators complying with pesticide label by wearing long-sleeved shirts compared to the goal of a 10% increase. Hypothetical case study. After educational event.

    Time phase        Inspected  Read "Pesticide Briefs"  Wear long sleeves  %*
    After education   125        123                      37                 30.1 a
    Goal              --         --                       37.8               30.7 a
    * Values in a column followed by the same letter are not significantly different at P=0.05 level (Z test).

    And in this case, no significant difference was found between the goal and rate of compliance after the run of newsletter stories. After analysis, the researchers concluded that CES/SLA newsletter stories increased compliance among pesticide handlers by at least 10%. If less than a 10% increase in compliance had been reached, educational efforts would have been expanded and the study repeated. Next the researchers can conduct training with a goal of increasing compliance by a higher percentage.

    METHOD II: PESTICIDE APPLICATOR TRAINING EVALUATION FORM

    PROGRAM PLANNING TO INCREASE ADOPTION OF SAFETY PRACTICES THAT REDUCE EXPOSURE DURING PESTICIDE HANDLING ACTIVITIES POSING THE GREATEST HAZARDS TO APPLICATORS IN THE STATE


    INTRODUCTION

    This method targets the most hazardous pesticide practice as defined by licensed applicators who attend training. Applicators pledge to adopt new behaviors designed to reduce exposure during this hazardous activity. Success of the training is determined by the proportion of applicators who adopt the new behaviors after training. This proportion is measured through use of an evaluation form that has been validated in Delaware by face-to-face interviews.

    PROCEDURE

    Plan Educational Events and set goals
    Staff from the SLA and CES should plan an educational workshop that includes personal safety procedures for pesticide handlers/ applicators. This may be initial certification training, recertification training, or WPS training. Organizers should estimate the proportion of handlers/applicators who will adopt new safety practices as a result of the training. As a guideline in setting the goal, the organizers should look at the success rate of comparable training in other states. If similar projects have not been conducted, then the organizers must look to the breadth and depth of their training.
    Print evaluation form and validate
    Workshop organizers may duplicate the evaluation instrument, "PUT SAFETY INTO PRACTICE! PESTICIDE APPLICATOR TRAINING EVALUATION," (Appendix A) or they may design a form of their own. Using the no-carbon-required (NCR) duplication procedure simplifies the study. The validation conducted by the University of Delaware can be adopted or the organizers can conduct their own validation following the example given in the case study below.
    Gather Data
    After the workshop, trainees should complete the evaluation instrument, "PUT SAFETY INTO PRACTICE! PESTICIDE APPLICATOR TRAINING EVALUATION." They should identify the most hazardous activity in their work, then list three new safety practices that they can adopt to alleviate this hazard. The applicators should keep the bottom copy of the 3-copy NCR form and turn in the other two copies to the trainer. Three to twelve months into the pesticide use season, one copy of the form should be returned to handlers/applicators with instructions to check those practices that they adopted. Confidentiality should be emphasized. Standard procedures for increasing the response rate should be followed as described in the case study below. The proportion of handlers/applicators adopting new safety practices should be calculated.
    Analyze and Interpret Data
    The p-formula of Cochran (1977) can be used to derive the 95% confidence interval on the proportion of applicators adopting new safety practices:


        P = p ± t sqrt[(1 - f) p q / (n - 1)]

    Where:

    P = The proportion of applicators adopting the new behavior.
    t = 1.96
    f = n/N, the sampling fraction
    n = The number of applicators returning the evaluation form after marking.
    N = The number of forms sent to applicators for marking.
    p = s/n, the fraction complying
    s = The number of applicators who adopt new behaviors.
    q = 1-s/n, the fraction not complying
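    The interval can be computed as follows. This is a minimal sketch with made-up counts, and the function name is ours:

```python
import math

def adoption_ci(s, n, N, t=1.96):
    """95% confidence interval on the proportion of applicators
    adopting new behaviors, with the finite population correction
    (1 - f) as defined above."""
    p = s / n          # fraction complying
    q = 1 - p          # fraction not complying
    f = n / N          # sampling fraction
    half = t * math.sqrt((1 - f) * p * q / (n - 1))
    return p - half, p + half

# Hypothetical: 150 forms sent, 120 returned, 90 show adoption.
lo, hi = adoption_ci(s=90, n=120, N=150)
print(f"{lo:.3f} to {hi:.3f}")  # 0.715 to 0.785
```

    Note how the finite population correction narrows the interval when a large fraction of the mailed forms comes back.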
    Apply Results to PAT and WPS Training
    The results from this evaluation can be used to design more effective educational programs in the state. If handlers/ applicators do not follow through with their pledge to adopt new practices, then program coordinators should expand training in those areas.

    ADVANTAGES TO USING THE TRAINING EVALUATION FORM TECHNIQUE

    1. Handler/ Applicator buy-in because they choose the behaviors to adopt.
    2. Uses existing training events.
    3. Tracks individuals as well as groups.
    4. Cause-and-effect relationships can be established.
    5. Good learning tool for handlers/ applicators.
    6. Chance for second learning experience when form is returned to them and they are reminded of their commitments.

    DISADVANTAGES TO USING THE TRAINING EVALUATION FORM TECHNIQUE

    1. Requires mailing of post-treatment form.
    2. Requires printing of "official" form.
    3. Each state must either conduct an expensive and time consuming validation or use the validation conducted by the University of Delaware.

    CASE STUDY: PUT SAFETY INTO PRACTICE!

    In Delaware, initial PAT is conducted five times each year for 1 ½ days each time. Topics dealing with personal safety are: PPE use and care; safe transportation; storage; mixing and loading; spill clean-up; and symptoms of poisoning and first aid. Most trainees have applied pesticides before attending PAT. They have either worked under the supervision of certified applicators or they have applied only general use pesticides. Thus, most trainees have knowledge of the hazards of pesticide use.

    After PAT, trainees complete the evaluation instrument, "PUT SAFETY INTO PRACTICE! PESTICIDE APPLICATOR TRAINING EVALUATION." They identify the most hazardous activity in their work, then list three new safety practices that they can adopt to alleviate this hazard. Time for completing the evaluation form is allotted on the agenda. Each applicator keeps the bottom copy of his/her NCR form and turns the other two copies in to the trainer. Forms similar to the one in this manual have been used in Delaware since 1991.

    From 12/94 to 9/95, 285 applicators attended PAT. However, as shown below, not all of them participated in the study:
    285 Attendees.
    - 47 Did not complete the form properly.
    - 66 Either did not obtain licensing, did not apply pesticides after training, or did not reside in the state. 
    _________
    172 Participated in the evaluation by completing the evaluation form properly, obtaining licensing, applying pesticides after training and residing in the state.

    In January 1996, the 172 completed forms were returned to the individuals. Applicators were asked to check the practices that they used at least once since training. They were given instructions to return the form to the University of Delaware. A business reply envelope was enclosed. The mailing sequence was: heads-up post card, letter with original evaluation form, post card reminder, second letter with photocopy of evaluation form. In addition, an attempt was made to reach non-respondents by phone. Of the 172 forms, 135 were returned to the University. Of these forms, 133 indicated that those applicators used at least one new safety practice at least once.

    In March 1996, the evaluation instrument was validated through face-to-face interviews with a subsample of 101 of the 133 respondents who indicated that they had used a new practice. Enumerators hired by the Delaware Agricultural Statistician conducted the interviews using a second instrument, "UNIVERSITY OF DELAWARE COOPERATIVE EXTENSION PESTICIDE APPLICATOR SAFETY INTERVIEW" (Appendix B). The enumerator read each question to the applicator and recorded his/her answer on the interview form.

    After the completed forms were submitted to the University of Delaware, they were paired with the first form, "PUT SAFETY INTO PRACTICE! PESTICIDE APPLICATOR TRAINING EVALUATION." Pairs of forms, with names removed, were given to a panel of judges. Each behavior that an applicator indicated as having practiced at least once since training on the first form was searched for on the second form. If responses on the second form indicated that the applicator did practice that behavior, "seldom, sometimes, often or always," then a positive score was recorded. If the majority of such scores were found to be positive, then the evaluation instrument, "PUT SAFETY INTO PRACTICE! PESTICIDE APPLICATOR TRAINING EVALUATION," was determined to be a valid instrument. It is important to keep in mind that the validation process is concerned only with validating instruments, not people. Finding the instrument "valid" means that the instrument is designed in such a way as to allow individuals to express themselves accurately.
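
    The judges' matching step can be sketched in code. This is only an illustration under assumed data structures (a list of behaviors checked on the first form, and a dictionary mapping each behavior to the frequency word recorded during the interview); it is not the panel's actual procedure:

    ```python
    # Interview answers that count as a positive score: the applicator
    # practiced the behavior at least "seldom".
    POSITIVE = {"seldom", "sometimes", "often", "always"}

    def score_pair(marked_behaviors, interview_answers):
        """Return the fraction of behaviors marked 'used at least once' on
        the evaluation form that the interview form confirms.

        marked_behaviors: behaviors checked on the first form
        interview_answers: mapping of behavior -> frequency word; a missing
        behavior is treated as 'never' (not confirmed)
        """
        confirmed = sum(
            1 for b in marked_behaviors
            if interview_answers.get(b, "never").lower() in POSITIVE
        )
        return confirmed / len(marked_behaviors)

    # Example: two of three marked behaviors confirmed by the interview.
    share = score_pair(
        ["wear gloves", "triple-rinse containers", "check wind speed"],
        {"wear gloves": "Always", "triple-rinse containers": "Often",
         "check wind speed": "Never"},
    )
    print(share)  # 2 of 3 behaviors confirmed
    ```

    If the majority of such scores across all paired forms are positive, the evaluation instrument is judged valid, as described above.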

    The panel of judges found that each behavior that an applicator indicated as having practiced at least once on the first form could be found on the second form as either, "seldom, sometimes, often or always." Thus all scores were positive. The evaluation instrument, "PUT SAFETY INTO PRACTICE! PESTICIDE APPLICATOR TRAINING EVALUATION," was determined to be a valid instrument.

    The results of this study show that 77.3% of applicators (133/172) who participated in the evaluation process practiced at least one new safety behavior at least once after training. No conclusion can be drawn about the 37 individuals who did not respond. Nor can any conclusion be drawn about the 47 individuals who did not participate in the study because they completed the form incorrectly, or about the 66 individuals who did not obtain certification, did not apply pesticides after training, or resided out of state. The next phase of this study will be to determine the proportion of applicators adopting new safety practices as routine.

    METHOD III: NEEDS ASSESSMENT

    PROGRAM PLANNING TO INCREASE ADOPTION OF TARGETED SAFETY PRACTICES IN THE STATE


    INTRODUCTION

    This method targets several pesticide safety practices for improvement through increased educational programming. Success of the educational efforts is measured by a reduction in training needs, as indicated by the applicators on a needs assessment instrument.

    PROCEDURES

    1. Target Behaviors
    2. Plan Educational Events and Set Goals
    3. Design Needs Assessment Instrument
    4. Data Gathering Events
    5. Analyze and Interpret Data
    6. Apply results to PAT

    ADVANTAGES TO USING THE NEEDS ASSESSMENT TECHNIQUE

    1. Targets specific behaviors in state.
    2. Quick and easy to administer.
    3. No mailings.
    4. Uses existing training events.
    5. No validation required for a needs assessment.

    DISADVANTAGES TO USING THE NEEDS ASSESSMENT TECHNIQUE

    1. Handlers/applicators cannot choose the safety practices to measure.
    2. Questions must be designed so that handlers/applicators do not simply give the answer they think you want to hear.
    3. Form will need to be revised as target behaviors are reached.
    4. Requires expensive printing of an "official-looking" form.
    5. Correlation between cause and effect is not as strong as with other methods.
    6. Tracks only groups, not individuals.
    7. Must be used for more than one year to show trends.

    DELAWARE CASE STUDY: READ THE LABEL AND WEAR PPE

    Educational activities for pesticide applicators take place throughout the year in Delaware. Workshops, seminars, newsletters, and radio programs are among the many activities offered by Delaware Cooperative Extension and the Delaware Department of Agriculture. The Annual University of Delaware Pesticide Conference was designated the data collection event for the needs assessment. This conference is the major recertification event for pesticide applicators in the state. A survey instrument, "PESTICIDE APPLICATOR TRAINING EVALUATION INVENTORY" (Appendix C), was written addressing two areas for pesticide handlers/applicators: PPE and label understanding. A goal of 80% relative compliance was set. Baseline data were collected from attendees at PAT for initial certification. Numbers of handlers attending the annual recertification conference who practiced the indicator behaviors were compared to the baseline as shown in Table 5.
    Table 5. Numbers of applicators practicing indicator behaviors on a needs assessment. Entries are numbers of applicators responding / total number of applicators.

    Question                                         Baseline           Year 1
    THE LABEL
    #1 reads label 3 or more times                   105/166            60/120
    #2 reads 6 or more label parts                   151/166            97/120
    #3 reads 6 or more label parts more than once    103/166            50/120
    TOTALS                                           359/498 = 72.1%    207/360 = 57.5%
    PPE
    #1, answer 2                                     150/166            113/120
    #2, answer 1                                     151/166            106/120
    #3, answer 1                                     144/166            93/120
    #4, thrown out                                   --                 --
    #5, answer 2                                     74/166             48/120
    #6, answer 3                                     139/166            89/120
    TOTALS                                           658/830 = 79.3%    449/600 = 74.8%


    These results can be interpreted to mean that when applicators leave initial pesticide applicator training (baseline), they have the correct attitude in regard to reading the label (72.1%) and wearing PPE (79.3%). With time, however, their practice of reading the label drops dramatically (57.5%), while proper use of PPE decreases only slightly (74.8%). The trainers in this study decided to emphasize reading the label to certified applicators. Articles were written for the quarterly newsletter, "Pesticide Briefs." The needs assessment will be conducted again at the next Annual University of Delaware Pesticide Conference. If the proportion of applicators correctly reading the label approaches 80%, then another area of pesticide application will be chosen for the needs assessment.
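
    The column totals in Table 5, and the check against the 80% relative-compliance goal, can be reproduced with a short script. The variable and function names here are illustrative, not part of the instrument:

    ```python
    # (responding, total) pairs from Table 5 for the label questions.
    label_baseline = [(105, 166), (151, 166), (103, 166)]
    label_year1 = [(60, 120), (97, 120), (50, 120)]
    GOAL = 0.80  # relative-compliance goal set for the program

    def compliance(rows):
        """Pooled proportion of positive responses across all questions."""
        responding = sum(r for r, _ in rows)
        total = sum(t for _, t in rows)
        return responding / total

    for name, rows in [("baseline", label_baseline), ("year 1", label_year1)]:
        rate = compliance(rows)
        flag = "meets goal" if rate >= GOAL else "below goal -- retarget training"
        print(f"Label, {name}: {rate:.1%} ({flag})")
    ```

    Running this reproduces the pooled label figures reported above (72.1% at baseline, 57.5% in year 1), both short of the 80% goal, which is why label reading was chosen for renewed emphasis.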

    Acknowledgments

    I would like to gratefully acknowledge the assistance of Claude Bennett, Program Evaluation Leader, USDA/CSREES, Washington, DC, and Peter Nowak, Professor of Rural Sociology, University of Wisconsin-Madison. Both of these individuals gave freely of their time and knowledge to educate me in the science of evaluations. They helped me express my thoughts and encouraged me to field test my procedures. This manual would not have been possible without them. I would also like to thank the staff of the EPA Region III office for funding this project. Don Lott, Section Chief, encouraged my work at every turn. The University of Delaware College of Agricultural Sciences statistician, John Pesek, was patient and generous in his explanation of statistics. I thank him for reviewing the final draft. I would also like to thank the members of the Behavior Grant Advisory Committee who reviewed drafts of this manual and offered advice:

    Kerry Hoffman, Extension Pesticide Education Coordinator, Pennsylvania State University, University Park, PA.

    John Impson, USDA/CSREES National Program Leader, Health, Environmental and Pesticide Safety Education (HELPS), Washington, DC.

    Larry Olsen, Extension Pesticide Coordinator and Integrated Pest Management Coordinator, Michigan State University, East Lansing, MI.

    Grier Stayton, Pesticide Compliance Administrator, Delaware Department of Agriculture, Dover, DE.

    John Tacelosky, Certification & Training Coordinator, Pennsylvania Department of Agriculture, Harrisburg, PA.

    Larry Towle, Agriculture Specialist, Delaware Department of Agriculture, Dover, DE.

    And I would like to thank all the Pesticide Coordinators and State Lead Agency staff who attended the workshops that I conducted on evaluation over the last 4 years. Their suggestions transformed this manual from theory into practice.

    Literature Cited

    Andreasen, Alan R. 1995. Marketing Social Change. Changing Behavior to Promote Health, Social Development, and the Environment. Jossey-Bass Publishers, San Francisco.

    Canerday, T. Don. 1989. Evaluation systems. Abstract. In Proceedings, National Pesticide Applicator Training/Certification Workshop, March 28-30, 1989, Denver, Colorado.

    Cochran, William G. 1977. Sampling Techniques, 3rd Edition. John Wiley & Sons, Inc., New York.

    Ladewig, Howard. 1994. Evaluation Research and Cooperative Extension: Friends or Foes? Abstract. Presented at University of Delaware Cooperative Extension Professional Development Annual Conference, May 24-26, 1994.

    Moore, David S. 1993. Introduction to the Practice of Statistics, 2nd Edition. W.H. Freeman & Co., New York.

    Nowak, Peter. 1994. Evaluation: Theory and Practice. In Conference proceedings, Strategies to manage Pesticides, September 17-22, 1994, University of Wisconsin-Madison, Wisconsin.

    Rogers, Everett M. 1983. Diffusion of innovations. 3rd edition. The Free Press, A Division of Macmillan Publishing Co., Inc., New York.

    Salant, Priscilla and Don A. Dillman. 1994. How to Conduct your own Survey. John Wiley & Sons, Inc., New York.

    Wintersteen, Wendy K. and Bonnie Poli. 1991. Documenting PAT Successes. In Proceedings National Pesticide Applicator Training and Certification Workshop, April 16-18, Arlington, Virginia. Iowa State University Extension Service. 


    Appendix A

    PUT SAFETY INTO PRACTICE!

    PESTICIDE APPLICATOR TRAINING EVALUATION

    We want to know if Delaware's Pesticide Applicator Training is meeting your needs. Your response on this form will help us design better educational programs. Thanks for your help.

    In the space below:

    1. Describe the pesticide handling activity in your work that is the most hazardous to you.
    2. List 3 new safety practices you can use to prevent exposure during this hazardous activity.
    3. Keep the pink copy for your files. Give the yellow and white copies to the trainer.

    1. The most hazardous pesticide activity in my job is:



    2. Three new safety practices that will prevent pesticide exposure during this hazardous activity are:
    (Do not mark in this space)
    a. ____________________________________ [ ] Used at least once
    [ ] Adopted routinely
    b. ____________________________________ [ ] Used at least once
    [ ] Adopted routinely
    c. ____________________________________ [ ] Used at least once
    [ ] Adopted routinely

    Name 



    Address 

    City/State 

    Zip code 

    Phone 
    CONFIDENTIAL INFORMATION

    Applicator keep pink copy. Give yellow and white to the trainer. 5/96


    APPENDIX B

    UNIVERSITY OF DELAWARE COOPERATIVE EXTENSION PESTICIDE APPLICATOR SAFETY INTERVIEW

    PESTICIDE APPLICATOR'S NAME __________________________________

    INTERVIEWER'S NAME ___________________________________________

    INTERVIEW DATE _____________

    INTERVIEW CODE _____________

    DO YOU HAVE ANY COMMENTS ON THE UNIVERSITY OF DELAWARE COOPERATIVE EXTENSION PESTICIDE APPLICATOR TRAINING PROGRAM?
     
     

    WOULD YOU LIKE A COPY OF THE REPORT THAT DR. WHITNEY WILL WRITE ON THIS STUDY?
     
     
     

    THANKS FOR YOUR HELP!

    May, 1996



     
     

    UNIVERSITY OF DELAWARE COOPERATIVE EXTENSION
    PESTICIDE APPLICATOR SAFETY INTERVIEW

    CODE ____________________

    For each question, circle the number that applies:

     1 = Never
    2 = Seldom
    3 = Sometimes
    4 = Often
    5 = Always
    N/A = Not applicable

    APPLICATION EQUIPMENT, DRIFT CONTROL AND CALIBRATION

    1. HOW OFTEN DO YOU TAKE STEPS TO ENSURE YOUR APPLICATION EQUIPMENT IS THE CORRECT CHOICE FOR THE JOB?
    1 2 3 4 5 N/A

    2. HOW OFTEN DO YOU MAKE SURE YOUR APPLICATION EQUIPMENT IS CLEAN & IN GOOD WORKING ORDER?
    1 2 3 4 5 N/A

    3. HOW OFTEN DO YOU MAKE SURE YOUR APPLICATION EQUIPMENT IS CALIBRATED?
    1 2 3 4 5 N/A

    4. HOW OFTEN DO YOU MAKE SURE YOUR APPLICATION EQUIPMENT IS USED PROPERLY?
    1 2 3 4 5 N/A

    PROTECTING THE ENVIRONMENT

    5. HOW OFTEN DO YOU CONSIDER PROTECTION OF THE ENVIRONMENT WHEN USING PESTICIDES?
    1 2 3 4 5 N/A

    6. HOW OFTEN DO YOU TAKE MEASURES TO PROTECT OFF- TARGET ORGANISMS (HONEY BEES, BIRDS, FISH) WHEN USING PESTICIDES?
    1 2 3 4 5 N/A

    SAFE TRANSPORTATION, STORAGE, HANDLING & DISPOSAL

    7. WHEN YOU TRANSPORT PESTICIDES, HOW OFTEN DO YOU TAKE STEPS TO DO SO IN A SAFE MANNER?
    1 2 3 4 5 N/A

    8. WHEN YOU STORE PESTICIDES, HOW OFTEN DO YOU TAKE STEPS TO DO SO IN A SAFE MANNER?
    1 2 3 4 5 N/A

    9. WHEN YOU DISPOSE OF EXCESS PESTICIDES &/OR CONTAINERS HOW OFTEN DO YOU TAKE STEPS TO DO SO IN A SAFE MANNER?
    1 2 3 4 5 N/A

    10. WHEN YOU MIX & LOAD PESTICIDES, HOW OFTEN DO YOU TAKE STEPS TO DO SO IN A SAFE MANNER?
    1 2 3 4 5 N/A

    PESTS & PEST CONTROL

    11. HOW OFTEN DO YOU USE IPM (INTEGRATED PEST MANAGEMENT)?
    1 2 3 4 5 N/A

    12. HOW OFTEN DO YOU USE THE LOWEST RATE OF PESTICIDE POSSIBLE?
    1 2 3 4 5 N/A

    13. HOW OFTEN DO YOU IDENTIFY THE PEST BEFORE CHOOSING YOUR CONTROL MEASURE?
    1 2 3 4 5 N/A

    RECORD KEEPING

    14. HOW OFTEN DO YOU KEEP PESTICIDE APPLICATION RECORDS?
    1 2 3 4 5 N/A

    UNDERSTANDING THE LABEL

    15. HOW OFTEN DO YOU READ THE LABEL BEFORE APPLYING?
    1 2 3 4 5 N/A

    PPE AND SAFETY

    16. HOW OFTEN DO YOU MAKE SURE YOUR WORKERS ARE PRACTICING SAFE PROCEDURES?
    1 2 3 4 5 N/A

    17. HOW OFTEN DO YOU MAKE SURE THAT YOU ARE PROTECTED FROM EXPOSURE TO PESTICIDES?
    1 2 3 4 5 N/A

    18. HOW OFTEN DO YOU MAKE SURE THAT YOU ARE NOT EXPOSING OTHERS TO PESTICIDES?
    1 2 3 4 5 N/A

    19. HOW OFTEN DO YOU USE THE RIGHT PESTICIDE TO DO THE JOB WITH THE LEAST TOXICITY TO HUMANS?
    1 2 3 4 5 N/A

    20. HOW OFTEN DO YOU WEAR THE CORRECT PPE (PERSONAL PROTECTIVE EQUIPMENT)?
    1 2 3 4 5 N/A

    21. HOW OFTEN DO YOU CLEAN, MAINTAIN & STORE YOUR PPE PROPERLY?
    1 2 3 4 5 N/A

    22. HOW OFTEN DO YOU KEEP UP YOUR EDUCATION ON PESTICIDE SAFETY?
    1 2 3 4 5 N/A


    APPENDIX C

    PESTICIDE APPLICATOR TRAINING EVALUATION INVENTORY

    We'd like to know if Delaware's Pesticide Applicator Training Program is meeting your needs. Please answer the questions below and turn in this form at the end of training. This information will help us design better training programs for all of Delaware's applicators.

    This form is confidential -- don't sign your name.
     

    Thank you for your help,

    Susan P. Whitney, Pesticide Educator

    C O N F I D E N T I A L I N F O R M A T I O N

    THE PESTICIDE LABEL

    --Check all the answers that apply.

    1. When do you read the pesticide label?

    2. Check the following parts of the label that you read: 3. Which of the following parts of the label do you read more than once?

    PERSONAL PROTECTIVE EQUIPMENT (PPE)

    --Check the one best answer.

    1. How do you decide what PPE to wear?

    2. How often do you check PPE for rips, holes or other signs of wear? 3. When do you clean washable PPE, such as cotton coveralls? 4. When do you discard disposable PPE, such as tyvek coveralls? 5. How often do you have a respirator fit test? 6. Where do you keep your respirator and cartridges? C O N F I D E N T I A L   I N F O R M A T I O N

                                                                                                                                                   