Wednesday, December 1, 2010

SAVE Appeal to Suspend "APBears" at Berkeley


TO: EVC George Breslauer

CC: Vice Provost Sheldon Zedeck, Associate Vice Provost Angelica Stacy, Associate Vice Chancellor Shelton Waggener, Dean Andrew Szeri; campus offices

FROM: The SAVE Coordinating Committee

Evidence and testimony from across campus document that APBears was “rolled out” in a condition unfit for the use of staff and faculty. Moreover, it is potentially damaging to faculty who are now preparing dossiers for promotion review (see below). It is widely understood that some administrators in charge of this system knew about its deficiencies but suppressed this knowledge and refused to correct problems so as not to miss their “rollout” deadline. If they have not apprised you of this situation, we do so now and ask that you investigate it. Academic Senate Chair Fiona Doyle has already written to VP Zedeck and the chair of the Senate Budget Committee about this problem; VP Zedeck has responded to Deans and Chairs, but his response addresses only one of the system’s problems (the CSIR data).

We believe that faculty should be able to use an accurate and well-designed online academic personnel system. APBears, as it now runs, is not that system. We therefore ask the following:

1. Suspend APBears immediately and re-implement it only when a taskforce of faculty and staff from across campus departments has reviewed its functionality and confirmed that the necessary and advisable changes to it have been made. Until then, faculty should be permitted to use the present bio-bib case system.

2. Make certain that any new iteration of APBears contains clear statements about which data are supplied by external (non-faculty) sources, whether errors in those data can be amended, and which information is supplied by faculty, and that it makes explicit that faculty are responsible only for the information that they supply.

Here is a partial list of what is wrong with APBears:
1. It is full of procedural errors, glitches, and technical problems that are time-consuming to fix or work around. Some of these could be fixed through diligent review by a committee of faculty from all levels and across campus departments. We ask you to convene this group.
2. The data supplied by the Administration on teaching and mentoring are unacceptably inaccurate. One professor found that she was credited with only 25% of her teaching; another, with 400%. The requests for detailed data on student mentoring and employment are unnecessary and burdensome; faculty are not the HR office. The first problem cannot be fixed because it involves CSIR data; the second can be fixed by switching from a pull-down to a narrative system and eliminating several informational field requests.
3. Faculty are prohibited from correcting many kinds of errors in the system, and some apparently cannot be corrected by anyone. Despite these errors, faculty are being required to sign a statement that they have read and approved all information in their files, even though they cannot see some of it. Faculty should not be coerced to sign a document that they cannot fully review. This situation is indefensible and probably legally actionable. It could be partly fixed by prominent statements on the website that acknowledge clearly and fully that the accuracy of CSIR data is in doubt, note that data uploaded to the system at the time of its roll-out cannot be corrected, and clarify that faculty are responsible only for the accuracy of statements that they upload to the system. This does not solve the problem of the proportion of CSIR data that is incorrect or unanalyzed, but it may improve the future accuracy of entered data.
4. Current estimates are that preparing a case in this system typically takes 20-40 hours longer than under the traditional case procedure, and this does not include the "one-time" uploading of personnel data and historical material. This could be remedied by eliminating requests for some data, removing the pull-down menus, and allowing greater use of narratives uploaded by the faculty (see 6, 7).
5. Our faculty are incredibly diverse in the products of their research, the modes of their teaching, and the scope of their professional activities. The "pull-down" menus are time-consuming and do not offer accurate descriptions or alternatives. These should be eliminated and a greater narrative freedom built in; otherwise faculty may as well simply append accurate bio-bib statements and ignore the data fields.
6. There is no way to rank the importance of many activities; hence, a talk to a Cub Scout troop is featured as prominently as election to a national academy. Chairing a panel could mean a lot of work or none at all; there is no role for “convener” or “organizer.” The roles of authors in publications are also not adequately assessed. This could be fixed with greater narrative freedom and the abolition of pull-down menus.
7. The extent and kind of data being gathered represent an unreasonable burden on the faculty. Many of these data have to be entered in three different ways, which is redundant and time-consuming. Many categories do not accurately or adequately assess work done on a project or activity, and represent a “one size fits all” approach to professional activity and achievement. The redundancy and unnecessary fields should be eliminated.
8. It has not been thought out or clarified to the faculty how external referees will access case information in this online system, in which confidentiality is mixed and at risk. Currently the old "hard-copy" approach is being used. Why, then, the new system?
9. Department staff are spending an undue amount of time learning this system and trying to interpret it and fix its problems for faculty, at a time when they can least afford to do so, given additional job burdens related to staff cutbacks.

These problems are not simply a matter of system “growing pains” or “first time only” problems that are finding speedy remedy. They appear to be intrinsic and endemic. This system was put into place and mandated before it was ready. The entire faculty and staff should not have to be the guinea pigs for this. Let’s not repeat the errors of the BFS system.

We estimate conservatively that the extra time this system imposes upon the faculty will cost the campus well over $300,000 in faculty time this year alone. We cannot begin to estimate the loss of staff time. The argument that time will be saved down the road is not sufficient justification for implementing a system (and APBears is by no means the only one) that has not been adequately reviewed, tested, and corrected before implementation by its principal users. Nearly every IT system on campus winds up making the faculty spend time entering data and negotiating systems that are designed not to help research and teaching, but to make the jobs of administrators easier. In the end, however, even that does not happen, because the systems, whether BFS, RES, or APBears, are so flawed that both administrative and faculty time are engulfed in trying to negotiate or work around them.

We agree that an online system ultimately could be easier for the faculty to use. We recognize that this is considered the case on some other campuses. However, given the structural problems of the APBears system, it is clear that this system is not ready to be used or implemented. It could be, but only with further study and correction. Thank you for your consideration of this unwieldy and burdensome campus crisis.