Abstract: Fred Keller prepared two supplements for students to use in conjunction with Keller and Schoenfeld (1950), “Matters of History” and “Schedules of Reinforcement.” The latter was found in a to-be-discarded file of Murray Sidman’s reprints and other items after his death in May 2019. After presenting evidence concerning the authorship of the supplement, the relation of its contents to the teaching of behavior analysis to introductory psychology students in the course for which the Keller and Schoenfeld textbook was designed is discussed. The text is a remarkable example of the teaching of scientific principles and research methods, especially group and single-subject designs, because it is so rich with data derived from real experiments. It offered introductory students facts and no fiction. It also is exemplary in its attention to relating scientific concepts to daily experience, a critical feature of scientific material directed to such introductory-level students.
Keywords: Keller and Schoenfeld, Principles of Psychology, Columbia University, Course supplement, Schedules of reinforcement.
Special Section
F. S. Keller’s “schedules of reinforcement” supplement to Keller and Schoenfeld (1950)
Received: 11 April 2021
Accepted: 15 April 2021
After Murray Sidman’s death in May 2019, his son sent the present authors a manila folder containing mostly reprints that his father had accumulated over his long professional life. Among the items was a mimeographed document in a single-document binder. The document was labeled “Supplementary Notes,” with the author listed as F. S. Keller. The notes were in two parts, separately numbered but stapled together as a single document with a single cover page. Figure 1 shows a photograph of the mimeographed cover page, dated 1956-57. The first part, titled “Matters of History” (hereafter, the History supplement), was 21 single-spaced typed pages. This part is reproduced in the 1995 B. F. Skinner Foundation edition of Keller and Schoenfeld’s Principles of Psychology (1950; hereafter, K & S). The cover page in the 1995 edition of K & S lists the date of that supplement as 1958-59, apparently a later iteration of the History supplement contained in the 1956-57 document. The second part of the 1956-57 document, which is the subject of this article, was not reproduced or mentioned in the 1995 Skinner Foundation edition. It is titled “Schedules of Reinforcement” (hereafter, the Schedules supplement) and is 11 single-spaced typed pages, followed by another five pages of cumulative records and hand-drawn graphs. Direct scans of the original mimeographed document appear on the Mexican Journal of Behavior Analysis website at http://rmac-mx.org/wp-content/uploads/2020/08/Original-scanned-handouts-for-KS.pdf. A typeset version derived from a character-recognition scan of the original Schedules supplement appears as Appendix A at the end of this article. A narrative review of some of the Schedules supplement content follows an analysis of its authorship.

Figure 1. Cover page that appeared before the History supplement, to which the Schedules supplement was attached. The two holes from the staple that held both supplements together are visible in the upper left corner of the page, as is the rust spot from the paper clip that at one time held the two supplements together. This same rust spot is visible on the reverse of the last page of the Schedules supplement.
The question of authorship of the Schedules supplement arose during the course of researching its provenance. Catania (2020, this issue) observed the following:
The earliest course materials for PSYC 1-2 were written by Keller and Schoenfeld, as was a supplement devoted to the history of psychology. As lab procedures were modified over the years some new material was perhaps written by teaching assistants and others. For example, in his autobiography Keller refers to supplementary readings written by Donald Bullock (Keller, 2009, p. 212). Other indirect evidence is that K&S usually adhered to the usage that responses rather than organisms were reinforced (Catania, 1987), whereas a supplement on reinforcement schedules does not do so. Inconsistencies in the reinforcement language also can be found in the lab handouts detailed below. Also, we might assume Keller would not have created a cumulative record with occasional negative slopes (Appendix 1, p. 6); on the other hand those could be attributed to a shaky hand drawing directly on an uncorrectable mimeograph master. (p. 308)
Catania acknowledged that Keller authored the 1956-57 History supplement in the first italicized portion, but raised the question in the second italicized portion as to whether Keller was the author of the Schedules supplement because of a difference in expression concerning the reinforcement of responses versus organisms (see Catania, 1969), something he also found in the laboratory manual. It also should be clarified that his subsequent comments above about the distorted (most likely hand-drawn) cumulative record do not apply to the Schedules supplement, but only to the laboratory manual in which it appeared. The following evidence strongly suggests that Keller was the author of the Schedules supplement.
1. Figure 1 shows that Keller is listed as the sole author of the supplement. This cover page was followed by the History supplement, with the Schedules supplement following, all in one binder and bound by the single staple as already noted.
2. Comparing the original typed mimeograph pages of the Schedules supplement to personal letters typed by Keller to Skinner suggests that both were typed on the same manual typewriter, and furthermore, were typed by Keller himself. Keller’s personal letters share at least two idiosyncratic features with the Schedules supplement: (1) question marks are preceded by an extra space (an idiosyncrasy his son, John, speculated might be the result of his history as a Morse code operator; J. Keller, personal communication, June 22, 2020) and (2) the lowercase “w” is often below the line of the bottom of other letters. The History supplement seems to have been typed on a different typewriter because it lacks the misplaced lowercase w and generally has a “cleaner” appearance. In the latter, none of the four question marks is preceded by an extra space.
3. In the second italicized portion of the above quotation, Catania (2020, this issue) suggested that K&S “usually” described responses rather than organisms being reinforced, whereas the Schedules supplement does not adhere to this usage. Were this the case, it could be taken as evidence that Keller did not author the Schedules supplement. In the Schedules supplement, based on a simple word search, there is reference to responses being reinforced 5 times and to organisms being reinforced 6 times. In K & S, also based on a similar word search of the text, the corresponding numbers are 18 and 13, respectively. In K & S, “organisms reinforced” includes “make his reinforcement,” “get his reinforcement,” and “get his pellet or candy.” In addition, the locutions “group was reinforced” and “group was extinguished” each appear twice. Strictly speaking, a group of subjects is not an organism, but it is not a response either. Thus, the usage in the Schedules supplement is consistent with that in K & S.
4. In his autobiography, Keller (2009, p. 212) stated that he wrote to Robert Yerkes, describing “our textbook and a set of supplementary reading that Donald Bullock was preparing,” raising the question of whether Bullock might have written the supplement. In the first place, Keller identifies Bullock as “preparing” (not having “written,” as the Catania quote above indicates) the supplementary reading. The timing also is off. This quotation appears in a part of Keller’s autobiography describing activities around 1948-49, several years before the date of the Schedules supplement. Much of the research described in the Schedules supplement was reported in psychological journals between 1952 and 1956. Bullock received his Ph.D. from Columbia in 1950 and apparently left soon thereafter. A 1951 publication lists his affiliation as the University of Buffalo, and by 1956 he was affiliated with Smith, Kline, and French pharmaceuticals. This leads to the conclusion that Bullock was not associated with the Psychology 1-2 course at Columbia in 1956, when the supplement is dated.
5. The Fred S. Keller Papers at the University of New Hampshire lists among its holdings the following manuscript:
Bullock, Donald H. (in collaboration with Fred S. Keller and William N. Schoenfeld). “Researches in the Science of Behavior.” 1949. Approximately 100pp., typed (Box 34, Folder 3)
This manuscript was unavailable to the authors due to the 2020 COVID-19 pandemic, but its title is different from the title on the cover page of the supplements, and from the title of each of the supplements.
The Google Books website contains the following entry:
COLUMBIA UNIVERSITY. Dept. of Psychology.
Researches in the science of behavior. Pt. 1-[2] By Donald H. Bullock in collaboration with Frederick S. Keller (and) William N. Schoenfeld. 2 v. illus. ©Donald Hartmann Bullock; 1Oct49, 1Feb50; AA140144, 146844.
6. Another possible author of the Schedules supplement is Schoenfeld. In terms of the development and implementation of the Columbia introductory psychology course, Keller and Schoenfeld were joined at the hip. They co-authored K & S, of course; wrote the 1948 American Psychologist article describing the first iteration of the course together (Keller & Schoenfeld, 1948); appeared with Frick (Frick et al., 1947) as co-authors of the article describing the equipment used in the laboratory portion of the course; and were listed together on the Bullock piece described above. In light of this history of co-authorship on K & S course-related material, it would seem very out of keeping with Keller’s personal style of inclusion to exclude the name of someone collaborating on the supplementary notes and unfathomable that he would add his name to the exclusion of the real author.
Although some of the evidence in the above points is circumstantial, taken together it leads the present authors to conclude that Keller is the Schedules supplement’s author, just as he is credited as being the author of the History supplement.
[Page number references to the Schedules supplement are to the pages of the original scanned mimeograph version that appears on the Mexican Journal of Behavior Analysis website at http://rmac-mx.org/wp-content/uploads/2020/08/Original-scanned-handouts-for-KS.pdf. The original page numbers are marked on the typeset version that appears in Appendix A at the end of this article by a bolded and bracketed “p” followed by the page number (e.g., [p. 2]).]
The Schedules supplement is dated 1956-1957, which falls between the publication of K&S in 1950 and the publication of Schedules of Reinforcement by Ferster and Skinner (1957) and Skinner’s (1957) outline of an experimental analysis of behavior. Thus, from an instructional perspective the Schedules supplement may be seen as a stepping-stone from K&S to Ferster and Skinner. The Schedules supplement is impressive for its level of explanation of schedules and the behavior processes at work. It is an amazingly sophisticated account for an introductory-level course. It is considerably richer in coverage of schedules of reinforcement than was K&S. Indeed, even the term “schedules of reinforcement,” used as the title of the supplement, was new compared to K&S, which mentions “reinforcement schedules” only once.
The contemporary parlance related to “schedules of reinforcement” is well established for behavior analysts in the 21st century thanks to Ferster and Skinner (1957), a groundbreaking book that introduced vocabulary, methods, and knowledge related to schedules of intermittent reinforcement of operant behavior. Before that, K&S was a similarly groundbreaking text that introduced Skinner’s views and methods (Skinner, 1938) to a completely new generation of psychologists from the 1950s forward (see, e.g., Catania, 2020, this issue). K&S for decades thereafter was used as a textbook and, as has already been noted, was reprinted in 1995.
Regarding what is now termed intermittent reinforcement, K&S primarily dealt with what Skinner had called periodic reconditioning, which was contrasted with regular reinforcement, in which every response is reinforced. Periodic reconditioning introduced regular intermittency, such as a time interval between reinforced responses (now called fixed-interval, FI, schedules) or a fixed count of responses between reinforcements (now called fixed-ratio, FR, schedules). K&S provided examples of periodic reinforcement and extinction after periodic reinforcement, with all six figures on periodic reconditioning in K&S showing results from the work by Skinner (1938). The main focus in K&S was on the temporal discrimination that developed on periodic schedules, where after training a pause in responding would form right after reinforcement. This resulted in scalloped cumulative curves for interval schedules and in pauses followed by a high response rate for ratio schedules.
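For readers who think procedurally, the two periodic contingencies can be restated as simple decision rules applied to each response. The sketch below is our illustration only, not anything appearing in K&S or the supplement; the class names and parameter values are invented for the example.

```python
# Minimal sketch of the two "periodic reconditioning" contingencies:
# fixed-interval (FI): the first response at least t seconds after the last
# reinforcer is reinforced; fixed-ratio (FR): every nth response is reinforced.
# Class names and parameter values are illustrative, not from the supplement.

class FixedInterval:
    def __init__(self, interval_s):
        self.interval_s = interval_s
        self.last_reinforcer = 0.0

    def response(self, time_s):
        # Reinforce only if the interval has elapsed since the last reinforcer.
        if time_s - self.last_reinforcer >= self.interval_s:
            self.last_reinforcer = time_s
            return True
        return False

class FixedRatio:
    def __init__(self, ratio):
        self.ratio = ratio
        self.count = 0

    def response(self, time_s):
        # Reinforce every nth response, regardless of when it occurs.
        self.count += 1
        if self.count >= self.ratio:
            self.count = 0
            return True
        return False

# A subject responding steadily once per second for 10 s:
fi = FixedInterval(interval_s=4)
fr = FixedRatio(ratio=4)
times = [float(t) for t in range(1, 11)]
print([t for t in times if fi.response(t)])  # -> [4.0, 8.0]
print([t for t in times if fr.response(t)])  # -> [4.0, 8.0]
```

At a steady response rate the two rules happen to reinforce the same responses; the behavioral differences K&S emphasized (scallops vs. break-and-run) emerge only once the subject's rate itself varies, because pausing delays reinforcement on FR but not necessarily on FI.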
K&S interestingly realized that the periodic reconditioning schedules were somewhat limited to laboratory work. As they stated:
Outside the laboratory, regular reinforcement [now called continuous reinforcement – each response is reinforced] is by no means the rule, but neither is strictly periodic reinforcement. It is hardly to be expected that a schedule of any fixed interval or any fixed number of responses would be scrupulously honored by an environment so crowded with different events. We may well ask, then, whether the results of aperiodic reinforcement are the same as those of periodic or regular reinforcement. (p. 98-99)
Aside from mentioning some studies with aperiodic reinforcement with respondents and galvanic skin responses, K&S only mentioned what now is called a variable-interval (VI) schedule, but with no data other than brief reference to the “straight-line character of cumulative-response curves” (p. 100) and the great resistance to extinction built by such a schedule. K&S did, however, observe that “in very few spheres of human activity is reinforcement either regular or strictly periodic, and, in certain cases, the effect of this aperiodicity is dramatically impressive” (p. 101). They then listed several examples like the chronic gambler, although they did not refer to any ratio-type aperiodic schedules.
Research on schedules after Skinner (1938) was slow to develop, and Skinner’s own clandestine war-related work on pigeons controlling “smart” weapons in the early 1940s was not known publicly until his description of Project ORCON (Skinner, 1960). Among other methods, that work involved development of variable-ratio (VR) schedules. Perhaps the first formal description of VI schedules was in Skinner (1950). Although not using that exact term, he did describe a mixture of time intervals between reinforcements, including a “zero second” interval in which the first response after reinforcement is reinforced. The intervals ranged from zero to 2 minutes, with an average of one minute, and occurred in mixed order within a session. A figure in the 1950 article shows straight-line cumulative records with no pauses after reinforcement, in stark contrast to the pattern developed on periodic schedules (FI and FR), where pauses after reinforcement are the norm. By the late 1950s, the main empirical sources of schedule work were Ferster and Skinner (1957) and Skinner (1957), where the terms periodic and aperiodic reinforcement were replaced with the now familiar schedule taxonomy and illustrated with experimental data. In Science and Human Behavior, Skinner (1953) described, without experiments or data, the four now-familiar schedules (FI, FR, VI, and VR). This new terminology appears also in the Schedules supplement, but not in K & S.
A main theme of the Schedules supplement is generalizability of basic findings. The text emphasizes schedule control of behavior of different species such as rats, pigeons, cats, monkeys, and humans. The K&S textbook was an important link between basic research, primarily Skinner’s, and general education in psychology. Generalizability of basic findings was therefore a critical component in the teaching of introductory psychology students. For example, Keller emphasized that a time discrimination develops on FI schedules (as evidenced by pauses in responding after reinforcement). This happens “with all the organisms thus far studied in the laboratory, including human beings” (Schedules supplement, p. 3). Similarly, Keller wrote that “[h]uman subjects provide fixed-ratio curves that are often indistinguishable from those produced by monkeys, dogs, rats, and other subjects” (Schedules supplement, p. 7).
The Schedules supplement has several interesting statements regarding scientific thinking. For example, K&S reported that on FI schedules each organism emits roughly the same number of responses between reinforcements independent of the value of the FI. Thus, one rat may make about 20 responses between reinforcements on FI 2 min and also 20 responses on FI 6 min. Keller wrote: “We now believe that this suggestion was based upon too little information; and that the number of responses per interval probably increases as the intervals get longer. At least two recent studies point clearly to such a conclusion” (Schedules supplement, p. 2). Students were to learn from this that suggestions or principles depend on empirical information that in turn depends on research, and that one could not progress too far with “too little information.” Keller articulated some issues related to experimental design as the cause of the initial erroneous conclusion. But then he emphasized that “the important thing, of course, is not the old error, but the new advance in our knowledge” (Schedules supplement, p. 2). In the same vein, when Keller asked how the size of the fixed interval in training would affect the response rate in extinction, he stated that “[c]ommon sense won’t give you the answer to such a question. Instead, laboratory information is desirable” (Schedules supplement, p. 3).
The Schedules supplement features a curious mixture of group design and single-subject design. Dinsmoor, reflecting on the research environment at Columbia University in the late 1940s to early 1950s, observed that:
It may come as something of a shock to those who became familiar with the experimental analysis of behavior only after [the Journal of the Experimental Analysis of Behavior] was founded, but almost all of the conditioning research during my stay at Columbia had been based on the traditional experimental design in which the mean performance of one group of subjects is compared with the mean performance of another group, treated differently in some way, and a statistical test is conducted to determine whether the results could have arisen by chance. The outstanding exception was Murray Sidman’s dissertation (Sidman, 1953). (Dinsmoor, 1990, p. 147)
During the 1950s, research designs in behavior analysis were still influenced by the types of “group” design from which Skinner departed in his early research. In the Schedules supplement, Keller described experiments using groups of subjects such that each group was exposed to one level of an independent variable; for example, six groups of rats each had one FI value and the groups were compared regarding the average number of responses per interval (study by Wilson, 1954; Schedules supplement, p. 2). Yet, on the same page, Keller related an experiment with a single-subject design. In it, each rat was exposed to different values of a light-termination schedule in which a strong light was turned on; after x seconds, the first response by the rat would switch off the light for 1 minute. Each rat was exposed to 6 values of the fixed interval. Nonetheless, Keller reported the data as averages across rats with the FI value as the independent variable (Figure 2 in the Schedules supplement). Keller’s important emphasis was that the response rate was a function of the interval value whether responding produced food or light removal, and whether a group or a single-subject design was used (in the supplement, Keller did not explicitly contrast the two designs when emphasizing the generality of the findings). Thus, this comparison reveals several layers of teaching generality to students regarding how schedules affect behavior across both consequences and methodological designs. Keller added that the light-removal schedule could be an FR as well as an FI schedule, as was shown in the single-subject design experiment with different FR values by Kaplan (1956), also described in the Schedules supplement.
In spite of some cases showing data averaged across subjects, Keller also emphasized the individual subject’s behavior as a determinant of schedule effects. He described the rate of responding before extinction as an important determinant of behavior in extinction as follows: “The way an animal behaves during reinforcement is a good indication of what he will do during extinction. A slow, steady responder during training may give a ‘fixed-interval’ type of curve in extinction, and a high-speed responder in training may give a ‘fixed-ratio’ curve in extinction, in spite of the fact both animals may have been working on a fixed-interval schedule” (Schedules supplement, p. 4). To this he added: “The important thing, however, is not the schedule, but the kind of behavior exhibited while the schedule is in operation” (Schedules supplement, p. 4). This issue of the subject’s behavior at one point being a determinant of behavior at a later point was further illustrated by an early reference to work by Ferster and Skinner, about five years prior to their book on schedules of reinforcement (the reference to Ferster & Skinner, 1952, in the Schedules supplement). At the time, terminology was not fully developed, and what was described is now called a mixed FR FR schedule, where two different ratios alternate randomly within a session, such as FR 50 and FR 250. After considerable training, a pattern develops whereby after reinforcement there is a short pause, and if the next ratio is the large FR 250 then the animal stops after 60 or 70 responses and then responds again with another similar response run until the FR 250 is reached. As Keller wrote: “The bird acted, you might say, as if he had a crude sort of counter or inner clock by which it regulated its own pecking behavior. … The cue for stopping a run, in each case, was suggested to be the emission of a certain number of responses, rather than by some change in the outside situation.
There was, you might say, a ’feedback’ from the animal’s own behavior that led him to stop at a given point” (Schedules supplement, pp. 7-8). Keller also described experiments by Mechner, which apparently were early versions of later-published work (Mechner, 1958). Rats had to press one lever x number of times and then switch to a second lever, which produced reinforcement if a sufficient number of presses had been made on the first lever. Mechner studied the number of responses emitted beyond the required number. These experiments showed how precisely a rat’s behavior on one lever can function as the only stimulus for switching to the second lever.
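As described, Mechner's procedure amounts to a counting contingency on the first lever. The following sketch is ours, for illustration only; the class name and the required count of 8 are hypothetical, not Mechner's actual parameters.

```python
# Sketch of a counting contingency of the kind Keller attributes to Mechner:
# a switch to lever B is reinforced only if lever A has been pressed at least
# `required` times since the last switch. The value 8 is illustrative.

class CountingContingency:
    def __init__(self, required=8):
        self.required = required
        self.count_a = 0

    def press_a(self):
        self.count_a += 1

    def press_b(self):
        # The switch collects reinforcement only if enough presses on lever A
        # preceded it; the count starts over after every switch.
        reinforced = self.count_a >= self.required
        self.count_a = 0
        return reinforced

c = CountingContingency(required=8)
for _ in range(10):       # ten presses on lever A ...
    c.press_a()
print(c.press_b())        # -> True: the switch is reinforced
for _ in range(5):        # only five presses this time ...
    c.press_a()
print(c.press_b())        # -> False: too few presses on lever A
```

The "excess" responses Mechner measured correspond here to the presses on lever A beyond the required count at the moment of the switch; only the rat's own count of its prior behavior determines the outcome, since no external stimulus signals when enough presses have occurred.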
This theme of when one aspect of behavior becomes a discriminative stimulus for another aspect of behavior was further developed in Ferster and Skinner (1957). Again, one has to appreciate Keller’s emphasis to his students in the Schedules supplement that behavior is related to the environment in an extremely complex manner such that one may not see the environment (the “outside situation” in Keller’s words) acting when behavior changes (as seen in the unusual pause patterns in mixed schedules). Instead, a history of training sets up certain patterns of behavior that somehow rely on other patterns. Interestingly, the conclusion that the experiments show that “an animal’s response may depend upon an inner ‘clock’” (Schedules supplement, p. 7) actually introduces controlling variables that are not independent variables, but are embedded in training history. These “inner clocks” as causes of behavior are not entirely different from what K&S warned about when they described the problems with “inferred internal” states or drives. As K&S wrote: “… the phrase ‘inferred internal state’ adds nothing to our knowledge of drive, because it denotes nothing beyond that which is contained within the observations themselves. It is, once again, a convenience of expression, and we might dispense with the term altogether if it were not for the effort involved in straining for technical purity” (K&S, p. 285). This complex issue of whether internal states are a result of experimental history or are independent causal agents remains a topic of discussion in contemporary behavior analysis (e.g., Eckard & Lattal, 2020).
The Schedules supplement also covers a comparison of VI and VR schedules. It describes an experiment later reported by Ferster and Skinner (1957) involving VI-schedule control of pigeons’ key pecking. In one case, key pecking was reinforced according to a VI 3-min schedule with intervals ranging from a few seconds to 6 min. On one occasion, the pigeon pecked steadily for 15 hours, accumulating 30,000 responses with practically no pauses in responding during the entire period. Keller amusingly quoted the authors this way: “The reporters of this study (Ferster and Skinner) would seem to be guilty of understatement when they assert that ‘the control exercised by a schedule of this sort may be very great’” (Schedules supplement, p. 9). We may easily assume that Keller and Skinner communicated frequently and in detail about the research they were doing at that time (Keller, 2009).
In the supplement, Keller described a VR schedule as one based on random variation in the number of responses that must be made before reinforcer delivery. Response rates become very steady, as for VI schedules, with no pauses. There is, of course, a difference from VI schedules. In the words of Keller: “the ratio schedule leads as a rule to higher response rates than does the interval schedule. This is because a response that follows any break in an interval schedule has a greater likelihood of getting rewarded; whereas a pause during a ratio schedule never increases the chances of reward for the next response” (Schedules supplement, pp. 9-10). Keller also related the VR schedule to gambling in humans, something Skinner (1953) had done earlier.
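Keller's explanation of the rate difference can be made concrete with a small simulation (entirely ours; the schedule values are arbitrary): on an interval schedule, reinforcement becomes available with the passage of time, so the response that ends a pause is more likely to be reinforced the longer the pause; on a ratio schedule, only responses advance the count, so pausing never improves the next response's chances.

```python
import random

# Illustrative simulation of Keller's point about pauses. On a VI schedule
# a reinforcer is "set up" after a variable delay (modeled here as an
# exponentially distributed time), so the first response after a pause is
# more likely to collect it the longer the pause. On a VR schedule each
# response pays off with a constant probability. Values are hypothetical.

random.seed(1)

def vi_next_response_reinforced(pause_s, mean_interval_s=60.0):
    # The next response is reinforced if the scheduled delay has elapsed.
    return random.expovariate(1.0 / mean_interval_s) <= pause_s

def vr_next_response_reinforced(mean_ratio=30):
    # Constant probability per response, regardless of any preceding pause.
    return random.random() < 1.0 / mean_ratio

def estimate(fn, trials=100_000, **kw):
    return sum(fn(**kw) for _ in range(trials)) / trials

for pause in (1, 10, 60):
    p_vi = estimate(vi_next_response_reinforced, pause_s=pause)
    p_vr = estimate(vr_next_response_reinforced)
    print(f"pause {pause:>2} s: P(reinforced) VI = {p_vi:.3f}, VR = {p_vr:.3f}")
```

Running this shows the VI probability rising steeply with pause length while the VR probability stays flat, which is exactly the contingency difference Keller cited to explain why ratio schedules sustain higher rates.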
The Schedules supplement ends with some examples of conditioning extreme rates of responding (what today are sometimes called schedules of inter-response-time reinforcement, among other labels). The contingency difference between VI and VR schedules, articulated above, can be used explicitly to drive response rates up, by making reinforcement contingent on emitting a response within a short time after the previous response, and to drive response rates down, by requiring a pause between successive responses. As with other examples in the Schedules supplement, Keller provided actual data from experiments to illustrate the reinforcement of different rates of responding.
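Both rate-shaping contingencies can be expressed as a single rule on the time between successive responses. The sketch below is our illustration; the function name and criterion values are hypothetical, not taken from the supplement or from Wilson and Keller (1953).

```python
# Sketch of rate-differentiating contingencies of the kind described at the
# end of the supplement: reinforce a response only if it follows the previous
# response quickly (driving rates up), or only after a required pause
# (driving rates down). Criterion values are illustrative.

def reinforced_irts(response_times, min_irt=None, max_irt=None):
    """Return the times of reinforced responses given an inter-response-time
    criterion: each response's IRT must fall within [min_irt, max_irt]."""
    hits = []
    for prev, cur in zip(response_times, response_times[1:]):
        irt = cur - prev
        if (min_irt is None or irt >= min_irt) and \
           (max_irt is None or irt <= max_irt):
            hits.append(cur)
    return hits

times = [0.0, 0.5, 1.0, 4.0, 4.3, 9.0]
print(reinforced_irts(times, max_irt=1.0))  # -> [0.5, 1.0, 4.3]: fast responding reinforced
print(reinforced_irts(times, min_irt=3.0))  # -> [4.0, 9.0]: spaced responding reinforced
```

The same response stream yields reinforcement for entirely different members depending on the criterion, which is the sense in which such schedules differentiate high versus low rates.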
The Schedules supplement does not conclude with comments or even a summary. No theory or basic principles were outlined. The overall text, however, is a remarkable example of the teaching of scientific principles and research methods, especially group and single-subject designs, because it is so rich with data derived from real experiments. In short, it offered introductory students facts and no fiction. It also is exemplary in its attention to relating scientific concepts to daily experience, a critical feature of scientific material directed to these introductory-level students.
Appendix A (pdf): Schedules of Reinforcement supplement

Converted editable copy of the original mimeographed “Schedules of Reinforcement” supplement, which is reproduced in its original mimeographed format at http://rmac-mx.org/wp-content/uploads/2020/08/Original-scanned-handouts-for-KS.pdf.
[Note: The cover page is as it appeared before the “Matters of History” supplement described in the accompanying article. This supplement on schedules of reinforcement directly followed the “Matters of History” supplement in the version received by the present authors. The cover page is included here simply as an introduction to the supplement, but should not be construed as being part of the original document, which appears at the above-noted web address. The actual cover page is reproduced in Figure 1 of this article. Bolded numbers in brackets (e.g., [p. 2]) demarcate the beginning of the indicated page in the original document. In this appendix, a new line begins at the start of each new original document page. Punctuation is as in the original document. Also, in this version, as compared to the original mimeographed version at http://rmac-mx.org/wp-content/uploads/2020/08/Original-scanned-handouts-for-KS.pdf, the figures have been rearranged slightly so they are in numerical order. Specifically, Figure 11 was moved from the last page of the original document to its nominal place in order in this appendix.]
(The original “Schedules of reinforcement” supplement did not have a reference list. These notes list the names of all the experimenters cited in the supplement, along with the article we believe to be the cited reference.)
Berryman, R., & Mechner, F. (1956). A reference to a published article by these two authors for this year was not found. The reference perhaps is to research that either led to or later appeared in Mechner’s 1957 doctoral dissertation at Columbia University, published as Mechner, F. (1958). Probability relations within response sequences under ratio reinforcement. Journal of the Experimental Analysis of Behavior, 1(2), 109-121. https://doi.org/10.1901/jeab.1958.1-109 Berryman’s suggestions and encouragement are acknowledged in a footnote to the latter work.
Boren, J. J. (1953). Response rate and resistance to extinction as a function of the fixed ratio. Unpublished doctoral dissertation, Columbia University. This research later was published as Boren, J. J. (1961). Response rate and resistance to extinction as a function of the fixed ratio. Journal of Experimental Psychology, 61(4), 304-308. https://doi.org/10.1037/h0040208
Conrad, D. G., & Sidman, M. (1956). Sucrose concentration as reinforcement for lever pressing by monkeys. Psychological Reports, 2, 381-384. https://doi.org/10.2466/pr0.1956.2.3.381
Ferster, C. B., & Skinner, B. F. (1952). A reference to a published article by these two authors for this year was not found. The reference may be to a research grant report of some kind or perhaps to an early partial draft of material that would be published by Ferster and Skinner in Schedules of reinforcement (1957). Appleton-Century-Crofts.
Kaplan, M. (1952). The effects of noxious stimulus intensity and duration during intermittent reinforcement of escape behavior. Journal of Comparative and Physiological Psychology, 45(6), 538-549. https://doi.org/10.1037/h0055989
Kaplan, M. (1956). The maintenance of escape behavior under fixedratio reinforcement. Journal of Comparative and Physiological Psychology, 49(2), 153-157. https://doi.org/10.1037/h0048735
Keller, F. S., & Schoenfeld, W. N. (1950). Principles of psychology. Appleton-Century-Crofts.
Sidman, M., & Stebbins, W. C. (1954). Satiation effects under fixed-ratio schedules of reinforcement. Journal of Comparative and Physiological Psychology, 47(2), 114-116. https://doi.org/10.1037/h0054127
Wilson, M. P. (1954). Periodic reinforcement interval and number of periodic reinforcements as parameters of response strength. Journal of Comparative and Physiological Psychology, 47(1), 51-56. https://doi.org/10.1037/h0057224
Wilson, M. P., & Keller, F. S. (1953). On the selective reinforcement of spaced responses. Journal of Comparative and Physiological Psychology, 46(3), 190-193. https://doi.org/10.1037/h0057705
[Commentary article, vol. 46, 305-352] http://dx.doi.org/10.5514/rmac.v46.i1.76976
The authors contributed equally to this article. We are indebted to John Keller for his invaluable help in determining the authorship of the “Schedules of Reinforcement” supplement (thank you, Watson), and to Charlie Catania for sharing with us an earlier draft of his reminiscence of “The course for which K & S was written” and material related to Keller’s autobiography.