California State Polytechnic University, Pomona
Prioritization Recovery Planning Committee
Rumors, Objections, Concerns, and Questions (ROCQ)
Revised 6/17/06
1. "This looks like Strategic Planning revisited"
2. "We are an FTE driven school and that will never change..."
3. "What would happen if my department refuses to participate?"
4. A department chair ... wonders if our data gathering will end up ignoring the great things going on.
5. "Dickeson points out the need to have the process supported by the governing board..."
6. "In the past, institutional resource numbers were always used in spite of those generated by the college. ... Will colleges be considered as respected sources of valid information?"
7. "What if we have a new program we want to initiate? Where is that considered in the criteria?"
8. "I am from a big college that doesn't have representation on the PRPC committee. I realize it is a charter committee, but I am not sure I feel represented."
9. "How will the committee know they can trust the information submitted on the instrument? How will they know if what they are reading is the truth?"
10. "What makes the committee qualified to decide what is quality and what is not quality in areas other than their own discipline?"
11. "Trend data can be misleading. How do you know if you are looking at a trend or just the upside of a cycle? Some disciplines run in cycles."
12. "Seems like this process is similar to what we did in the 90s. Both seem to be sub-optimizing the campus."
13. "Why don't we just get rid of small programs? They are a drain on our resources."
14. "Is there any consideration for rewards other than increased funding?"
15. "It seems like we are putting more emphasis on teaching and discouraging research."
16. "There appears to be a lack of communications regarding the prioritization process at the faculty level. Information is not reaching the faculty."
17. "Will there be an undue emphasis on satisfying external stakeholders that will harm service programs?"
18. "Money still seems to be the driving force around here. I am afraid there will be undue bias toward programs that generate overhead money from grants or do a lot of fundraising."
19. "How will we guarantee that the prioritization process will not threaten the concept of a broad education for our students?"
20. "It seems like the use of statistics in analyzing matrices filled with data has the danger of missing or losing important points."
21. "Robert Dickeson has taken what some believe to be controversial positions in some recently published papers. Aren't we really buying into Dickeson's unpopular schemes if we continue with prioritization?"
22. "We know our program is at least 30 years old, but do not seem to have information that pre-dates that. Is that enough to complete the historical narrative in the template?"
23. "When we complete Criterion #4, will members of the committee be familiar with the terms and explanations used by the program? If you are not, will you consult with the faculty for an understanding?"
24. "What is the purpose of the college response? It looks like it is counter to the purpose of the prioritization process."
1. "This looks like Strategic Planning revisited"

There is a sense in which Dickeson's prioritization process can accomplish some of what the strategic planning process accomplishes in the early stages of assessment, and there are some parallels. Dickeson's approach does not address implementation in any depth; that is left to campus leadership. One very significant characteristic of the Dickeson approach is that the focal point of the process is identifying and addressing "programs," not departments, colleges, or divisions. This subtle difference is worth reflection. In many ways this approach aligns more comfortably with the Cal Poly Pomona culture and the "shared governance" tradition of the academic community in general. With thoughtful consideration, the results could be helpful in the WASC reaccreditation process.
2. From Fall Conference 2005, one long-term department chair commented: "We are an FTE driven school and that will never change. I don't want to give up anything that is generating FTE, whether I like the program or not. The Prioritization process will not change the FTE culture. Budget is allocated by FTE no matter what the administration says. When times are tough, the dean will look at your FTE, nothing else."

Similar concerns have been expressed: "FTES generates dollars for the campus, so if it gets redistributed based on mission and merit, the program rewarded may not generate the FTES to cover those costs," "Will the FTES driven nature of the CSU be changing?" and "Does the Chancellor know we do not care about FTE any more?"

The President made an initial response by saying that the CSU is not changing the allocation of budget based on FTES, but we intend to change our culture internally to be less FTES driven. The campus has no intention of reducing the FTES it generates unless the reduction is planned. The expectation is that this process will actually produce FTES more easily and efficiently by shifting some resources to programs that are in demand and well-managed, or to needed, effective support of those programs.

In reality, we are a "pseudo-FTES" culture. We use FTES to allocate budget in only one of the four divisions right now, and within that division there are a number of academic support programs that do not generate FTES. Beyond that, we do not divide dollars among the colleges on a strictly per-FTES basis. Historically, we have used FTES targets and actual FTES to make adjustments to college allocations, but the targets are historical numbers, not based on mission, quality, or efficiency in any consistent manner. In addition, FTE allocation methods vary within the colleges.
3. What would happen if my department refuses to participate?
The approach
taken within Academic Affairs is to allow each college to provide
input on the criteria indicators and later on the weighting of the
criteria. If someone chooses not to participate, then the final approach
will, unfortunately, be lacking their input. At some point Deans will
be asking each Department to provide information about their programs.
Deans will have to decide how to handle situations where departments
choose not to provide the information. Decisions will ultimately be
made about funding for each program on campus.
4. A department chair asked if the criteria used would include great things that the departments are doing that are not reflected in FTE. He said he spends a lot of time making end-of-year reports that don't seem to have any use, and wonders if our data gathering will end up ignoring the great things going on as well.
There are several
areas where a program’s uniqueness, value, and strengths can
be mentioned. The Opportunity Analysis section can be used to show
how funding would be used to enhance the program’s success in
these and other areas.
5. Dickeson points out the need to have the process supported by the governing board; otherwise, those parties that are unhappy with the outcomes can run to trustees they have influence with and get the decisions reversed.
The committee
has been informed that the Chancellor and many other campuses are
aware of what we are doing and are watching with interest. The degree
to which trustees are aware of what is going on at Cal Poly Pomona
is not known to us at this time.
6. "Without having valid campus data from institutional resources, the weight continues to fall on department chairs to track and generate real numbers. In the past, institutional resource numbers were always used in spite of those generated by the college. Many times the data within the college was up to date compared to older stats from the computerized data center. Will colleges be considered as respected sources of valid information?"

We will use institutional numbers as much as possible to reduce variation issues when comparing programs to each other. This concern was also brought up at a focus group on March 17, where department chairs from six different colleges and schools were asked to review the criteria and indicators for feedback on the usability of the template. At their suggestion, optional fields were added to the template where programs can comment on institutionally provided data if they feel it is appropriate.
7. What if we have a new program we want to initiate? Where is that considered in the criteria?
When first asked this question, it seemed the appropriate response was to say that new programs should be included in the "Opportunity Analysis" section of the criteria, where we ask, "If you had additional funds, what would you do?" Perhaps a more appropriate response should add that the normal process for getting an academic program adopted would not change. However, at this point it seems realistic to run a proposed program past the criteria to see if it would pass muster next to other programs. Ultimately, the level at which any new program is funded is an administrative decision.

Perhaps it would also be good to ask what the college's and/or division's policies and procedures are for adding programs.
8. "I am from a big college that doesn't have representation on the PRPC committee. I realize it is a charter committee, but I am not sure I feel represented."
There apparently
was thoughtful discussion on this point when the committee was being
planned by the Executive Committee of the Academic Senate. The committee
is supposed to be setting up and implementing a balanced, fair process,
not creating a venue for a potential turf battle. Looking at what
other universities have done seems to indicate that the approach taken
here was appropriate.
9. How will the committee know they can trust the information submitted on the instrument? How will they know if what they are reading is the truth?
We always have
this concern with proposals or self-reviews: program reviews, leave
proposals, grant proposals, and self-studies of all kinds. The committee
has already discussed this topic and plans to address it in our methodology.
10. What makes the committee qualified to decide what is quality and what is not quality in areas other than their own discipline?

As with the previous question, the process needs to address this issue. The responses must be evaluated, and the recommendations reviewed, in a way that employs collective wisdom. Asking for college feedback on the criteria indicators and the weighting process, and allowing for responses to recommendations, are additional ways that misinterpretations in assessing quality can be minimized.
11. Trend data can be misleading. How do you know if you are looking at a trend or just the upside of a cycle? Some disciplines run in cycles.

One reason the PRPC has requested that the environmental scan be updated is to provide some guidance about long-term trends in various job markets within the state. These forecasts are available and would be helpful if used as part of a balanced review.
12. Seems like this process is similar to what we did in the 90s. Both seem to be sub-optimizing the campus.

The approach we are taking is an attempt to prioritize programs for funding decisions using balanced, mission-driven criteria. While the approach is admittedly suboptimal in some respects, campuses that have undergone it report great satisfaction with the results and success with programs that deserved additional funding.
13. Why don't we just get rid of small programs? They are a drain on our resources.
Program size
is not necessarily a reason for discontinuance.
14. Is there any consideration for rewards other than increased funding? If I get increased funding to support more or larger programs in my department, I am just going to be more overworked than I already am.

Without being able to probe the question, it seems that increased funding can be used to hire additional faculty, buy new equipment, and expand the support services whose shortage may be the source of overwork and stress. Increased funding can also be used for release time, travel, research, and faculty development, which could help reduce stress and increase the joy of working in academia.
15. It seems like we are putting more emphasis on teaching and discouraging research.

There are many kinds of research that are very appropriate for a comprehensive university like ours. In Scholarship Reconsidered, Boyer discusses the scholarship of discovery, the scholarship of integration, the scholarship of application, and the scholarship of teaching as vital areas for research. We have the rich opportunity to engage in all of these areas well within the purview of our various programs.
16. There appears to be a lack of communications regarding the prioritization process at the faculty level. Information is not reaching the faculty.
The PRPC has
made presentations and/or met with the Deans Council as well as the
chairs of Science, CLASS, Engineering, Business, and Agriculture.
PRPC representatives have met with the entire faculty of CEIS and
with the University Council of Chairs and the Executive Committee
of the Academic Senate. The purpose of these meetings was to share
our progress, answer questions, and hear concerns. In November, each
college was asked to review the proposed criteria within the college
and provide feedback. The expectation was that all faculty would receive
a copy of the proposed criteria and have a chance to provide feedback.
The reality is that trickle-down communication on campus is very inconsistent.
We have recommended that direct communication with faculty be used
to explain and clarify the university-wide prioritization process
on our campus. As of this writing, it appears that these recommendations
are being adopted and the results will be forthcoming.
17. Will there be an undue emphasis on satisfying external stakeholders that will harm service programs?

Apparently there was some perceived bias toward external stakeholders in the planning or program review efforts of the past. That is not the case in this process. Internal and external demand are equally important in our campus culture and environment. Internal demand takes several forms, most notably service courses and General Education courses and programs. The Senate Steering Committee has put together a trustee committee which includes individuals with a very deep understanding of these issues and of the need to appropriately recognize service-related programs.
18. Money still seems to be the driving force around here. I am afraid there will be undue bias toward programs that generate overhead money from grants or do a lot of fundraising.

As of this writing, 14 major criteria (not necessarily of equal weight) have been developed to encompass the university's goals in the prioritization process. Success related to grants and fundraising makes up portions of only three of these criteria. In reality, the process prevents the prioritization of programs from being biased toward any one type of criterion or any one type of program.
19. How will we guarantee that the prioritization process will not threaten the concept of a broad education for our students?

Probably the most appropriate response to this concern is that the recommendations of the PRPC will be delivered to the Senate and the President for consideration. Final decisions about how to shift any funding, or to combine or eliminate programs throughout the university, can take into account the broader issues and the direction the campus community wants to go. Also, the prioritization process does not undermine the curriculum process, which is really the driving force in providing a broad education.
20. It seems like the use of statistics in analyzing matrices filled with data has the danger of missing or losing important points.

Each criterion (of which there are 14 at this time) is composed of multiple indicators; some of the indicators are qualitative and some are quantitative. While statistical data are involved, the evaluation and weighting process will provide balanced results. No single indicator or criterion will have an overriding effect on a program's final ranking.
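To make the arithmetic concrete, here is a minimal sketch (in Python) of how a weighted, multi-indicator scoring scheme of this kind can work. The criterion names, weights, scales, and scores below are hypothetical illustrations only; they are not the committee's actual criteria, weights, or scoring method.

    # Hypothetical illustration of weighted, multi-indicator scoring.
    # The criteria, weights, and scores are invented for demonstration;
    # they are NOT the PRPC's actual 14 criteria or their weights.
    # Each criterion maps to (weight, list of indicator scores on a 0-5 scale).
    program_scores = {
        "Internal demand":     (0.20, [4, 3, 5]),
        "External demand":     (0.15, [2, 3]),
        "Quality of outcomes": (0.35, [5, 4, 4, 5]),
        "Cost effectiveness":  (0.30, [3, 2, 4]),
    }

    def overall_score(scores):
        """Average the indicators within each criterion, then take a
        weighted sum across criteria. Each criterion contributes only in
        proportion to its weight, so no single indicator can dominate."""
        total = 0.0
        for weight, indicators in scores.values():
            criterion_score = sum(indicators) / len(indicators)
            total += weight * criterion_score
        return total

    print("Overall program score: %.2f" % overall_score(program_scores))

Because each criterion's indicators are averaged before the weighted sum is taken, even an extreme score on one indicator moves the final result only modestly.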
21. Robert Dickeson has taken what some believe to be controversial positions in some recently published papers. Aren't we really buying into Dickeson's unpopular schemes if we continue with prioritization? (Revised 4/30/06)
Dickeson’s book about the prioritization process provides a
basic “program” oriented approach that any campus could
use for deciding how to allocate budget cuts, realign current funding,
or pass out budget increases. Dickeson proposes that mission-driven
criteria be used to evaluate, rank, and eventually prioritize programs
and services for funding purposes. This basic approach was used to
design the charge
for the PRPC with autonomy to figure out the best way to accomplish
the task based on campus culture and institutions. As one committee
member puts it, "He provided a framework, not a mandate."
While Dickeson was the catalyst of sorts, the committee also reviewed many other authors who have written on the subject of academic program planning, accountability, resource allocation, state funding criteria, etc. For example, we examined and found very helpful the writings and research of Fred Volkwein of Penn State ("Establishing Academic Priorities," Journal of Higher Education 49: 472-488). Also helpful was Honoring the Trust: Quality and Cost Containment by William F. Massy (in fact, the cabinet, possibly including the leadership of the Academic Senate, actually had a phone conference with him). Professor Joseph Burke has written extensively on the subject of accountability in higher education; two of his titles are Performance Funding for Public Higher Education, Jossey-Bass (1998), and Accountability in Higher Education, Jossey-Bass (2004). We had Sally Johnstone on campus speaking about some of the same issues. In a recent contact, Professor Fred Volkwein said he would be willing to work with us when we begin to look at the data.
In reality, beyond the basic prioritization approach of evaluating programs rather than departments, disciplines, colleges, or divisions, the PRPC has carved its own path on how to fulfill the committee's charter (for example, we have 14 criteria, as compared to Dickeson's recommended 10). Reviews of prioritization efforts at other universities continue to show that each campus is customizing its approach to prioritizing programs to match its culture, circumstances, and needs.
Since the chartering of the PRPC, two articles by Dickeson that address accreditation and the cost of higher education have been published. Dickeson's recent articles, and whatever ideas he may have about accreditation and the cost of higher education, are not something the PRPC has had on its radar screen. The PRPC has no intention of being the poster child for Dickeson or his ideas on accreditation, faculty, or the cost of higher education. Dickeson's book on prioritization is a fast read and stands on its own merits. All faculty and staff are encouraged to read the book and see for themselves the rationale behind the basic process.
22. "We know our program is at least 30 years old, but do not seem to have information that pre-dates that. Is that enough to complete the historical narrative in the template?"
The purpose
of this section of the template is to briefly provide the reader with
the context of the program within the fabric of the university. If
30 years is enough to do that, then it seems unnecessary to dig further
back.
For anyone who would like to find out when a program actually began, probably the best resource is the catalog collection in the Special Collections section on the 3rd floor of the Library. Below is a link showing its hours:
http://www.csupomona.edu/~library/specialcollections/
Sometimes emeritus faculty who can relate why a program was started are available by email, if the reason is not commonly known. All that being said, if a quick look at the archives or an inquiry with emeritus faculty doesn't produce any useful insight, then it would probably be a more effective use of time to state the context of the program using the best collective historical information from the faculty and move on to developing input for the other narratives on the template.
23. "When we complete Criterion #4, will members of the committee be familiar with the terms and explanations used by the program? If you are not, will you consult with the faculty for an understanding?"

Reasonable question. We have already anticipated that we could be asking questions about any template narrative that could use clarification, not just Criterion #4. We are not afraid to seek out the proper people and ask questions.
24. "What is the purpose of the college response? It looks like it is counter to the purpose of the prioritization process."
The college-level response is supplemental information only. Early in the Spring quarter, the Deans expressed concern that, under the Prioritization & Recovery process, program information would be submitted without any context from the college level. Dean Barbara Way brought that concern back to the committee, and we thought it was reasonable. (Our research showed that some campuses that had implemented this process had solicited parallel input for contextual purposes from colleges or deans, such as Chadron State College, Nebraska.) So, at the Monday, April 3, Deans' meeting, Barbara Way announced that each college could put together a college-level review process and submit any recommendations, remarks, or plans they wanted at the same time the program information was due (June 15). These recommendations would be available to our committee for consideration, but were not a substitute for the PRP process. Each college could choose to complete the process with its existing organizational structure or create something ad hoc just for this purpose. The only requirement was that, at a minimum, we wanted a "college" opinion about how the college would rank the Criterion 4 spending requests for its programs. Everything else was optional. (Note: this was specifically not to be a "Dean's" response.) Some colleges created a plan and deadlines so they could complete the college-level process before June 15. Other colleges waited until programs submitted Criterion 4.1 narratives to begin the review process, which produced a serious time crunch in meeting the deadline.