Improving the Comprehensive Editing Skills of Technical Communication Students

A report on the feasibility of a research study

Michael J. Freeman

Chicago, Illinois

Executive summary

This report examines the feasibility of conducting a research study into using specialized software to aid in teaching comprehensive editing to technical editing students.

In “An Analysis of Student Comments in Comprehensive Editing,” Albers and Marsella (2011) report that students of technical editing are not learning how to comprehensively edit documents. Instead, students tend to focus on paragraph-level and sentence-level issues.

Students might learn better comprehensive-editing skills by using specialized editing software that switches between modes of editing (global-level, section-level, and paragraph-level). A research study in which the technical-editing skills of students were measured before and after they completed editing exercises with the specialized software would determine the effectiveness of the software as a teaching tool.

That research study would have value in three ways: 1) it could help improve the skills of students of technical editing; 2) it would address what may be a gap in the current research; and 3) it could help improve the writing skills of all students.

No currently available software was found that could be used for the research study. However, it should be possible to use functions that currently exist in the API for Writer, the word processor component of Apache OpenOffice, to create an extension for Writer that would meet the needs of the study.

Based on the effect sizes reported in a meta-analysis of the impact of computers on student writing skills (Goldberg 2003), a minimum sample size of 40 students (split into two groups) was calculated. Two groups of students are needed to create a quasi-control group, which reduces various threats to the validity of the results and also simplifies the statistical analysis.

There is a gap in the current research on the effects of software on students' writing and revising abilities; specifically, there are few articles that illuminate the issue of high-level edits by students. Similarly, there is a gap in the available software packages intended to aid editors in high-level edits and writers in revising; specifically, there is apparently no software intended to aid in high-level edits.

The proposed research study would address both of those gaps.

Introduction

Comprehensive editing versus copyediting

While good copyediting is required for any form of professional writing, comprehensive editing is particularly important for technical communication because it “looks beyond words and sentences to the way in which readers will read and use the document” (Rude 2011, p. 206). Technical documents aren't merely read; they are used. Readers want to flip through them to quickly find a particular piece of information, and they want to use a document's structure to help organize talks and coursework.

The major difference between comprehensive editing and copyediting is that, “instead of reacting line by line to the text,” you assess “the document as a whole” (Rude 2011, p. 206).

But what aids are there for students of technical editing to learn or practice comprehensive editing? While a cursory search for software that assists users with copyediting found eight commercially available software packages, a longer search for similar software focused on higher-level (“beyond words and sentences”) edits found no candidates.

This imbalance of aids for the nascent technical editor runs directly counter to the needs identified in “An Analysis of Student Comments in Comprehensive Editing”:

Results: Both effective and ineffective commenting habits were observed. Students were found to make a high percentage of paragraph-level comments and a low percentage of global and sentence-level comments. Most of the comments were rated as useful to an author. Looking at specific problem areas, most students missed commenting on four major problems within the text. The students seemed to be using a linear editing style of simply moving through the document from beginning to end, rather than using a top-down editing style or multiple passes. (Albers and Marsella 2011)

The specific recommendations of the paper include: “Teaching students how to perform a comprehensive edit requires teaching a focus on analyzing a document's global structures.”

I do not propose a causal relationship between the lack of software aids for comprehensive editing and the lack of comprehensive-editing skills in students. But it may be possible to reduce the latter by eliminating the former.

Again, it is generally understood that high-level editing and copyediting involve different mental processes:

“Analysis before editing discourages line-by-line reaction to errors and sentence structure. The line-by-line approach can work for copyediting, but it does not direct the editor's attention to big-picture issues of content, organization, and style nor to the document in use.” (Rude 2011)

The specialized software being proposed would aid students by directing their “attention to big-picture issues” by allowing the selection and movement of entire sentences and paragraphs within a document. It would lock out lower-level changes, and therefore prevent students from being distracted by “line-by-line reaction to errors”.

Summary of proposed research study

The subjects for the proposed research study would be students in a college-level technical editing course. The specialized software would be an extension to freely available word-processing software, Apache OpenOffice (www.openoffice.org).

The student subjects would be split into two groups: one would use the word-processing software with an extension to do comprehensive editing assignments, and the other would use the software without the extension to do the same assignments. Having both groups use the same base software package to complete the same assignments, and varying only the use of the extension, is a simple (if not perfect) way to reduce the effects of other independent variables. (For more details on the imperfections of this approach see, e.g., “Quasi experimentation” (Cook 1990).)

The comprehensive editing skills of both groups would be tested at the beginning of the technical editing course, and then again at the end. The pretest and posttest results of the two groups would be compared using appropriate statistical methods.
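The report does not prescribe a specific statistical method for that comparison. As one illustration only (not a commitment of the study design), the gain score of each student (posttest score minus pretest score) could be compared across the two groups with an independent-samples t-test. The sketch below is written in Python, and all scores in it are hypothetical placeholders.

Illustrative sketch (Python)

# Compare gain scores between the extension group and the quasi-control group.
# All scores are hypothetical placeholders; the real study would use the
# scores produced by the chosen pretest/posttest instrument.
import numpy as np
from scipy import stats

extension_pre = np.array([12, 15, 11, 14, 13])
extension_post = np.array([18, 21, 17, 20, 19])
control_pre = np.array([13, 14, 12, 15, 11])
control_post = np.array([15, 16, 14, 17, 13])

# Gain scores: how much each student improved over the course.
extension_gain = extension_post - extension_pre
control_gain = control_post - control_pre

# Two-sided independent-samples t-test on the gains.
t_stat, p_value = stats.ttest_ind(extension_gain, control_gain)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")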

Three reasons to conduct the proposed study

There are three potential benefits to doing the proposed research study.

First: this study would evaluate a way to improve the learning of technical editing students, and therefore could result in better skills in program graduates.

Second: this study would fill an apparent gap in the current literature on the effects of software on the revising habits of students, and therefore could produce a publishable paper.

In 2003, Goldberg et al. published “The effect of computers on student writing: A meta-analysis of studies from 1992 to 2002.” They did analyses for three outcome variables: quantity of writing, quality of writing, and number of revisions. Of the studies they found with enough information for a statistical analysis of the results, 14 were about quantity of writing and 15 were about quality of writing.

But there were only six studies related to number of revisions that Goldberg et al. could use in their meta-analysis, and those studies were too small for them to report more than qualitative results. So there is a paucity of published studies on the effects of software on students' revising habits.

Third: this study would likely have generalizable results, and therefore could lead to improvements in the writing skills of all students at an institution.

Multiple studies have found that the use of word-processing software improves the writing of students through the revision process. As explained by Goldberg et al., “revisions made by students using word processors resulted in higher quality writing than did students revising their work with paper and pencils.”

The tool that helps technical editing students copyedit, word processing software, also helps to improve the writing of all students through revision. Presumably, a tool that helps technical editing students perform comprehensive edits will also help improve the writing of all students who are writing documents of sufficient length to benefit from comprehensive editing-like revisions.

Methodology

Literature search provided recommendations for design of study

A literature search was done to find articles on the effect of computers/software on student writing/revising, with a focus on finding examples of empirical studies.

Though not comprehensive, the article “The effect of computers on student writing: A meta-analysis of studies from 1992 to 2002” (Goldberg 2003) gives a good overview of research on software and the writing and revising/editing of students, and also provides an analytical summary of the empirical results of the studies included in the meta-analysis. The effect sizes reported in it are used in the experimental design of the proposed study.

The chapter "Critical issues, current trends, and possible futures in quantitative methods" (Crawford 2001)(Crawford 2001) in the tome ironically titled “Handbook of Research on Teaching,” is an excellent overview of the application of empirical methods to evaluating educational programs. According to that chapter, despite the prevalence of research which measures the effectiveness of a given educational program by comparing the test results of students before they begin a program and after they complete it, “...we are on shaky ground when we try to infer causality from evaluations of single-groups only using pretest and posttest” (p134, Crawford 2001Crawford 2001).

Single-group studies are also biased toward Type I errors (false positives): “The absence of a control or comparison group and the use of the same instrument for both pretest and posttest usually guarantees an apparent positive effect on student learning” (ibid., p. 134).

Using a “quasi-control group” (Kirk 1995, p. 23) would avoid that error. The quasi-control group in the proposed study would follow the same course syllabus, do the same exercises, and be measured with the same instrument, but would not use the specialized software. It would not be a true control group, since there is no placebo for the specialized software.

However, a quasi-control group should eliminate the need for a multivariate ANOVA, such as was done in “Teaching College Composition with Computers: A Program Evaluation Study” (Bernhardt 1989). In that study, each subject was taking an introductory college composition course from one of 12 instructors. The students were split between the using-computers group and the not-using-computers group by course section, and did not pass through the same course: “We did not urge teachers to attempt to teach the two courses in parallel fashion, with computers being the only variable.” The complexity of the analysis in that study can be avoided in the proposed study.

Using results from literature search to estimate sample size

Based on the results reported in “The effect of computers on student writing: A meta-analysis of studies from 1992 to 2002” (Goldberg 2003), it is reasonable to expect an effect size of about 0.46. To get a statistical power of 0.8 at a significance level of 0.05, given two groups, the study would need to have 20 students in each group. (This calculation was done with the “pwr” package (Champely 2013) for R (www.r-project.org).)

Output from R

> pwr.anova.test(k=2, f=.46, sig.level=.05, power=.8)

Balanced one-way analysis of variance power calculation

k = 2

n = 19.55492

f = 0.46

sig.level = 0.05

power = 0.8

NOTE: n is number in each group
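As a rough cross-check on the R calculation above, the same sample-size arithmetic can be reproduced with Python's statsmodels package; this is an illustration of the calculation, not a requirement of the study.

Cross-check (Python)

# Reproduce the sample-size calculation: Cohen's f = 0.46, two groups,
# alpha = 0.05, power = 0.80. FTestAnovaPower reports the TOTAL sample size.
from statsmodels.stats.power import FTestAnovaPower

n_total = FTestAnovaPower().solve_power(
    effect_size=0.46, alpha=0.05, power=0.8, k_groups=2
)
print(n_total)  # roughly 39-40 students in total, i.e., about 20 per group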

Search for currently available software

I was unable to find any currently available software that is intended to help with comprehensive editing. Most software intended to help with document revision is focused on low-level edits: checking spelling and grammar. There is some software intended to improve “readability” and “cohesion,” which might be considered part of high-level editing. But none of these tools is involved in the process of editing; rather, they report the results of analyzing the text using “computational linguistic techniques.” And even the utility of that is unclear:

However, any rule-based effort to improve your writing is going to be wrong a significant part of the time. Readability and grammar tools can be refined, but only to a limited extent. Both have been available in office suites for over twenty years, and neither is anywhere near as reliable as an experienced editor. (Byfield 2009)

In the end, it seems as if all document-revision aids are, in fact, attempts at document-revision automation. In contrast, the specialized software for this research study is intended to leave it up to the user to analyze the document and decide what edits to make. The user would be helped by being forced to focus on high-level edits; the user would be kept in a global conceptualization of the document, and not distracted by copyediting-level issues.

Estimating the cost of software development

Extension for Apache OpenOffice: The intended functionality of the specialized software can be achieved through an extension for Apache OpenOffice's word-processing application, Writer. The extension would let users select and rearrange document content at the sentence and paragraph level. Functions already available in the API include:

gotoStartOfSentence

gotoStartOfParagraph

gotoEndOfSentence

gotoEndOfParagraph

gotoNextSentence

gotoNextParagraph

gotoPreviousSentence

gotoPreviousParagraph

It is unclear how difficult it would be to lock out lower-level edits while the extension is in use.
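For illustration, the following minimal sketch shows how those cursor functions could be driven from a Python macro through OpenOffice's UNO scripting interface. It only selects the sentence under the cursor so it can be moved as a unit; the macro name is hypothetical, and the actual extension would need considerably more than this.

Illustrative sketch (Python macro for Writer)

# Select the whole sentence at the view cursor so it can be cut or dragged
# as a unit. Assumes the script runs inside Apache OpenOffice, where the
# scripting framework provides the XSCRIPTCONTEXT object.
def select_current_sentence(*args):
    doc = XSCRIPTCONTEXT.getDocument()
    controller = doc.getCurrentController()
    view_cursor = controller.getViewCursor()

    # Create a model cursor at the view cursor's position. Writer text
    # cursors implement XSentenceCursor/XParagraphCursor, which supply the
    # goto* functions listed above.
    text = view_cursor.getText()
    cursor = text.createTextCursorByRange(view_cursor.getStart())

    cursor.gotoStartOfSentence(False)  # move to sentence start, no selection
    cursor.gotoEndOfSentence(True)     # extend the selection to sentence end

    # Show the selection in the UI so the student can move the sentence.
    controller.select(cursor)

# Make the macro visible in the Tools > Macros dialog.
g_exportedScripts = (select_current_sentence,)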

Alternatives for improving students' editing skills

The proposed use of specialized software may be overkill. In “Can We Succeed in Teaching Business Students to Write Effectively?” Pittenger et al. (2006) reported that perhaps “writing assessment outcomes can be improved through the use of a grade incentive, no matter how minuscule, in combination with continuous deliberate instruction on the fundamentals of writing.” Similarly, a practically significant improvement in the comprehensive editing skills of students may be achieved by changing technical editing courses to focus more on comprehensive editing.

However, changing technical editing courses to focus more on comprehensive editing would not help to fill the gap in the research on the effect of software on students' revision performance. Also, that change to technical editing courses could not be generalized into a way to improve writing instruction for all students.

Cost-Benefit Summary

The primary sources of cost in the proposed research study are the cost of developing the word-processor extension and the cost of administering the pretest and posttest to the students.

The potential benefits of the research study are:

  1. Increased effectiveness of teaching technical editing.
  2. Increased effectiveness of teaching writing.
  3. Contribution to the literature.

Conclusions

The gap in the research in regard to software and the revising/editing of students would seem to be related to the lack of software specifically targeted to that task. Word processing software has always been intended to facilitate copyediting and low-level revision, and there is research into the impact of software on those areas.

Similarly, there is research into the effect of spell-checking software, e.g., “Does Spell-Checking Software Need a Warning Label?” (Galletta 2005). But unlike checking spelling, or even grammar, the decisions made as part of comprehensive editing cannot be easily codified. While automated checks of spelling and grammar are possible, there is no apparent way to automate the comprehensive editing process. It follows that, since no high-level editing software exists, no research on its effects exists.

So the proposed specialized software would both benefit students and open up this area of research for exploration.

Recommendations

The following is an outline of steps for conducting the proposed research study.

  1. Contract development of word-processor extension for comprehensive editing
  2. Identify instrument for pretest and posttest
  3. Establish partnerships with programs offering technical editing courses
  4. Roll out extension to partners; train instructors on extension
  5. Coordinate administration of the instrument with partners
  6. Conduct pretest
  7. Evaluate instruments and provide results to partners
  8. Conduct posttest
  9. Evaluate instruments and provide results to partners
  10. Perform statistical analysis of testing results
  11. Evaluate effectiveness of program

References

Albers, Michael J. and John F. Marsella. 2011. “An Analysis of Student Comments in Comprehensive Editing.” Technical Communication 58 (1): 52-67.

Apache Software Foundation. 2014. “Editing Text.” Accessed 14-Mar-2014. wiki.openoffice.org/wiki/Documentation/DevGuide/Text/Editing_Text.

Bacon, Donald R., and Elizabeth Scott Anderson. 2004. “Assessing and enhancing the basic writing skills of marketing students.” Business Communication Quarterly 67 (4): 443-454.

Bernhardt, Stephen A., Penny Edwards, and Patti Wojahn. 1989. “Teaching College Composition with Computers: A Program Evaluation Study.” Written Communication 6 (1): 108-133.

Byfield, Bruce. 2009. “OpenOffice.org: The Limits of Readability and Grammar Extensions.” 08-Sep-2009. Accessed 14-Mar-2014. www.linuxjournal.com/content/openofficeorg-limits-readability-and-grammar-extensions.

Champely, Stephane. 2013. “Package ‘pwr’.” Accessed 12-Mar-2014. cran.r-project.org/web/packages/pwr/pwr.pdf.

Crawford, J., and J. C. Impara. 2001. “Critical issues, current trends, and possible futures in quantitative methods.” Handbook of Research on Teaching 4: 133-173.

Collier, Richard M. 1983. “The word processor and revision strategies.” College Composition and Communication 1983: 149-155.

Cook, Thomas D., Donald T. Campbell, and Laura Peracchio. 1990. “Quasi experimentation.” In Handbook of Industrial and Organizational Psychology, eds. Marvin D. Dunnette and Leaetta M. Hough, 491-576. Palo Alto, CA: Consulting Psychologists Press.

Enos, Marcella F. 2010. “Instructional Interventions for Improving Proofreading and Editing Skills of College Students.” Business Communication Quarterly 73: 265-281.

Galletta, Dennis F., Alexandra Durcikova, Andrea Everard, and Brian M. Jones. 2005. “Does Spell-Checking Software Need a Warning Label?” Communications of the ACM, 48: 82-6.

Goldberg, Amie, Michael Russell, and Abigail Cook. 2003. “The effect of computers on student writing: A meta-analysis of studies from 1992 to 2002.” The Journal of Technology, Learning and Assessment 2 (1): 4-48.

Kirk, Roger E. 1995. Experimental design: Procedures for the behavioral sciences. Pacific Grove, CA: Brooks/Cole.

Pittenger, Khushwant K.S., Mary C. Miller, and Jesse Allison. 2006. “Can We Succeed in Teaching Business Students to Write Effectively?” Business Communication Quarterly 69 (3): 257-263.

Quible, Zane K. 2006. “Five strategies for remediating sentence-level writing deficiencies.” Business Communication Quarterly 69 (3): 293.

Rude, Carolyn D., and Angela Eaton. 2011. Technical Editing. Boston.