


Assessment in action: A study of lecturers' and students' constructions of BTEC national assessment practice in a college engineering programme area

Carter, Alan



Authors

Alan Carter



Abstract

This research examines the nature and form of Edexcel’s BTEC National assessment policy and practice, as found within a small college Engineering Programme Area. The study investigated the salient influences and considerations underpinning both the explicit and implicit lecturer assessment constructs. The backwash effects of these constructs are considered, including how they impact on lecturers’ micro-level classroom practice and on students’ engagement with assessment. The study also considers the effect assessment has on preparing students for progression from BTEC National programmes.

BTEC National qualifications of the 2000s have their origins in the 1970s Technician Education Council’s programmes, founded on the recommendations of the Haslegrave Committee’s Report (Haslegrave, 1969). Although BTEC programmes have evolved over the past four decades, the central tenets of Haslegrave, that of unitised, teacher-assessed, broken-up summative assessment, still underpin BTEC National assessment of the 2000s. Current BTEC units are criterion-referenced, and employ formative assessment as an integral aspect of the educational ethos of the qualification.

The research design involved a single site case study of assessment-in-action within a small programme area offering BTEC Nationals in Electrical and Electronic Engineering and in Manufacturing Engineering. This study used an interpretative approach, based on semi-structured interviews with seven lecturers and thirteen students during academic years 2006-2008.

Findings suggest BTEC assessment practice relies significantly on the integrity of the lecturers, who construct their assessment practice by accommodating and balancing various external and internal requirements and influences placed upon them. It is through the programme area community of practice that notions of standards evolve, these being significantly influenced by cultural considerations, which impact on all aspects of assessment practice.

This study finds an ethical departmental ethos in which all students should pass, and an assessment regime implicitly designed to aid student retention and achievement, but from which emanates a focus on criteria compliance. This tends to produce assessment constructs encouraging instrumental learning, where students’ achievements can be based on incremental improvement of the same assessment through multiple attempts, and where the potential for developing learning is diminished as formative assessment becomes conflated with summative intent. Both the assessment regime and the type of learning implicitly encouraged have the potential to hamper some students’ preparedness for progression from the BTEC National programmes.

Based on the findings of this research, considerations and recommendations are offered, both at the macro level of BTEC policy and at the departmental programme area micro-level of classroom practice, with the intention of enhancing students’ preparedness for progression from the National programmes. The study concludes that, despite radical changes in technician assessment practice having occurred since the instigation of the Haslegrave recommendations, concerns emanating from assessment practice of the 1950s and 60s are still present within modern-day BTEC assessment, a case of plus ça change.

Citation

Carter, A. Assessment in action: A study of lecturers' and students' constructions of BTEC national assessment practice in a college engineering programme area. (Thesis). University of the West of England. Retrieved from https://uwe-repository.worktribe.com/output/949201

Thesis Type Thesis
Publicly Available Date Mar 28, 2024
Public URL https://uwe-repository.worktribe.com/output/949201
Additional Information: ** This is an EdD thesis **

EXECUTIVE SUMMARY

Thesis reference: CARTER, A. (2012) Assessment-in-action: a study of lecturers’ and students’ constructions of BTEC National assessment practice, in a college Engineering Programme Area, EdD Thesis, School of Education. Bristol, University of the West of England.

What the thesis is about?

This research examines the nature and form of BTEC National assessment policy and practice within a small Engineering Programme Area. This was achieved by investigating the salient influences and considerations underpinning both the explicit and implicit lecturer assessment constructs, and how these impact on both lecturers’ micro-level of practice and on students’ engagement with assessment. This study also considers the effect assessment has on preparing students for progression from the National programmes.

Research Questions

1) What are the salient influences impacting on the development, implementation and effectiveness of modern-day BTEC National assessment practice in engineering?
2) What is the nature and form of ‘BTEC National assessment practice’ at the micro-level of classroom engagement?
3) How do the constructions of assessment affect students’ preparedness for progression to employment or higher education?
4) What can be learned from this study that will enable assessment to facilitate students’ learning and improve their preparedness for progression?

College context

This research involved a case study at one College of Further and Higher Education, serving a small, remote coastal community. The College provides a wide range of educational opportunities, from non-vocational leisure and specialist interest classes to further education, full-time degree and post-graduate programmes.
During the period of this research, the College had 817 students enrolled on its full-time courses, of which 109 were studying engineering courses and, of those, 35 students were enrolled on BTEC National Engineering Programmes. The College concentrates on vocational education, although it also provides access to Higher Education programmes through association with several UK-based universities. It should be noted that the College is not ‘incorporated’ and is not subject to the same funding arrangements as colleges in England and Wales, which receive funding essentially based on ‘payment by results’ (Wolf, 2011, p. 60) relating to annual student retention and achievement rates. The College in this study receives funding based purely on the number of students enrolled by November of each academic year.

What are BTEC Nationals?

BTEC Level 3 Nationals are semi-educational, semi-vocational courses (Wolf, 2002, p. 90) offered by the awarding body Edexcel/Pearson. They differ from A-levels in that they are teacher-assessed and designed to have a more or less specific vocational orientation relating to a range of vocational sectors (Edexcel, 2010, p. 2). BTEC Nationals are valuable in the labour market (Wolf, 2011, p. 33), well recognised, and widely accepted by higher education for entry onto degree courses, especially courses in similar areas (Wolf, 2011, p. 50). The current BTEC National programmes, with their unitised structure and internal assessment format, are founded on recommendations related to technician education from the government-instigated Haslegrave Committee (Haslegrave, 1969), dating back to 1969. The forerunner qualifications of the modern-day engineering BTEC Nationals were introduced in the early 1970s under the auspices of the government-instigated Technician Education Council (TEC).
However, technical education has a very long history, and the origins of the ‘National’ qualifications can be traced back to the ‘invention’ of the Ordinary National Certificate (ONC) and Diploma (OND) after the First World War (Foden, 1951, p. 38), administered by various regional Joint Committees. Certificates or diplomas were awarded on success in a group of subjects, with assessment primarily based on ‘once-for-all end-of-session examinations’ (Bourne, 1984, p. 747). Failure in one exam constituted failure-in-all, with the entire year of study repeated. This external exam assessment system was considered by the Haslegrave Committee to be exacerbating the significant student wastage and failure rates of the time. Exams were also considered a ‘confining influence on the teaching’ (Haslegrave, 1969, p. 23, para. 131), liable ‘to give a false picture of the student and his [sic] real achievement’ (Haslegrave, 1969, p. 42), and an unsatisfactory way of testing the ability of technicians (Haslegrave, 1969, p. 43, Bourne, 1984, p. 747). Haslegrave proposed a move away from extensive dependence on formal examinations as the main or only measure of a student’s performance as a technician, and instead emphasised teacher-based internal assessment, proposing changes in frequency and the use of a variety of methods such as written papers, practical or oral examinations, and course and project work (Haslegrave, 1969, p. 42). The part-time ONC was the predominant mode of study, forming a system of certification for apprentices and other students in Engineering from the 1920s for over fifty years, until the formation of TEC in the 1970s and BTEC in the 1980s. The BTEC National Certificate, through its long historical development, still forms an essential element of the education and training of technician engineers.
Within the modern engineering apprenticeship programmes, the BTEC National satisfies the knowledge-based requirements underpinning NVQ workplace-based competences, and provides additional knowledge to facilitate progression to HE or higher levels of working (SEMTA, 2010, p. 26). All assessment for BTEC Nationals is criterion-referenced, as opposed to norm-referenced, and based on the achievement of specified learning outcomes (Edexcel, 2002b, p. 10) stipulated in the form of assessment criteria. BTEC units also contain contextualised grading criteria individually graded as ‘merit’ or ‘distinction’. To achieve a pass grade for the unit, learners must meet the assessment criteria stated in the specifications (Edexcel, 2002a, p. 10, Edexcel, 2010, p. 11). Since the late 1980s, formative assessment has underpinned student learning and achievement, with current BTEC National programmes employing formative assessment as ‘integral to the educational ethos of the qualification’ (Torrance et al., 2005, p. 14).

Research Design

The research design involved a single-site case study of ‘assessment-in-action’ within an Engineering Programme Area at a college. A cohort of BTEC National Diploma students (n = 13) were interviewed in their first and second years of study from 2006-2008. Lecturers involved with delivery of this programme were interviewed in July 2007 (n = 7). All interviews were semi-structured, undertaken on a one-to-one basis within college classrooms, and transcribed. The interview transcripts formed the predominant data source, although an array of documentation, such as lecturer-devised assessment instruments and associated students’ scripts, was also accessed and reviewed. Lecturers’ transcripts were initially analysed deductively using themes from the literature, whilst students’ transcripts were analysed inductively, although this distinction became blurred as the analysis unfolded.

What was found?
1) The salient influences on assessment

A salient external structural constraint on lecturers’ BTEC assessment practice when developing summative assessment instruments was the requirement for a student to achieve all learning outcomes, specified through assessment criteria, to pass a unit. Not achieving one criterion resulted in a student failing a unit and possibly failing the programme. Another main influence was the Awarding Body verification procedures, through which Edexcel endeavoured to achieve national standards. External verification was undertaken by Edexcel-appointed examiners reviewing a sample of assessments, whilst internal verification was a peer-review system undertaken by departmentally appointed lecturers, reviewing all assessment material before it was issued to students. Both verification systems also reviewed a selection of students’ scripts. However, this study found the above Awarding Body requirements were viewed in the context of the culture of the learning site, and of lecturers’ perceptions of students’ traits, such as academic background, abilities, and dispositions to learning. This affected all aspects of the constructed practice, including choice of assessment method, the range and depth of content covered, the academic level set, and approaches to offering opportunities for re-assessment.

2) The nature and form of assessment

The Engineering Programme Area assessment practice was founded on a co-construction through which lecturers endeavoured to accommodate the various explicit and implicit influences placed upon them. Within the programme area there was an underpinning ethical ethos from lecturers that all students enrolled should achieve the qualification.
In the context of the above influences and requirements, lecturers used the well-documented ambiguity and subjectivity inherent within criterion-referenced assessment to set standards and practices which accommodated their cultural perceptions of students with regard to expectations of ability and approach to study. Lecturers set a standard that all students could achieve through regular attendance at lectures, with limited work required outside of the classroom. A referral system, used to support students requiring additional attempts to pass an assessment, also formed an integral aspect of assessment practice. Referrals tended to be developed on a one-to-one basis, dependent on student needs and circumstances. This system aided students’ achievement, particularly for the weaker students, by offering multiple attempts at an assessment, primarily underpinned by on-going, undocumented, verbal feedback between lecturer and student. Formative assessment in this context became conflated with summative assessment and was used with an implicit emphasis on ‘closing the gap on the criteria’, not on developing learning. Thus, the departmental ethos of ‘all should pass’ tended to see lecturers limit the demands of assessment in both an academic and a work-loading sense, which had the side-effect of encouraging students to adopt, or remain at, a superficial level of engagement with their studies. These assessment constructs benefited student retention on the programme and so could enhance student achievement rates, but in so doing limited the challenge of learning. This had the potential to reduce the understanding and proficiency of the weaker students, who could achieve through incremental improvement, and of all students, who could achieve through instrumental learning.
The concern with the type of learning implicitly encouraged through the constructed assessment practice was its potential to hamper students’ preparedness for progression from the National programmes.

3) Preparedness for progression from the National

The three main progression routes for National students were to: (i) undergraduate study, (ii) Higher National technician programmes, or (iii) entry into employment. The best-supported progression path from students’ perspectives was to Edexcel Higher National study at the College, on which the same lecturers taught, using the same range of assessment methods, and where the same approach to referral, with on-going feedback and multiple opportunities to pass, was available. Entry to engineering undergraduate programmes often required students to obtain higher grade awards across the majority of National units studied. This research found lecturers associated the integrity of the National qualification with the higher grade awards and, unlike achievement of the pass grade, lecturers tended not to support continual opportunities to achieve merit and distinction criteria. For higher grades, lecturers expected students to show greater commitment and autonomy, requiring them to be self-motivated and independent learners, thereby developing the study traits expected to underpin university study. However, there were concerns that the lecturers’ constructions of assessment practice did not prepare students well for progression to university study. Firstly, the use of broken-up summative assessment through methods such as open assignments and open-book tests did not encourage or require students to undertake revision, and so did not develop their ability to revise. Revision remains an important ability for students studying engineering at university, where end-of-year closed-book examinations are still a prominent feature of assessment practice (McDowell, 2004).
Another concern relates to BTEC’s competence-based approach to achievement, where all assessment criteria have to be achieved to pass a unit. National students are allowed, and come to expect at pass grade level, multiple attempts at the same assessment, supported by interactive, personalised lecturer feedback. University study is primarily a ‘one shot’ opportunity, numerically marked, with limited recourse to referment. These differing methods of measuring achievement again raise concern over students’ awareness of, and preparation for, traditional-type examinations. Both progression to Higher National and to undergraduate engineering programmes will require study of mathematics at an advanced level. As this study found, the constructed assessment practice allowed students to make incremental improvement through an assessment, and implicitly encouraged the use of instrumental learning, where classroom work was closely aligned with assessment questions. These characteristics of the BTEC assessment practice allow students to progress through a unit without attaining commensurate proficiency, and so without developing the understanding and skills required to underpin progression to higher-level study. How BTEC National programmes prepared students for progression to employment was more difficult to assess, as technician employment is so diverse. The methods of assessment used, such as open and closed assignments with timescales often spanning weeks rather than hours, had the potential to provide authentic assessment opportunities. Within these methods, students were allowed access to reference literature, with tasks often vocationally orientated, allowing relevant, industrial-based scenarios to be simulated. Such methods had the potential to encourage development of research-based skills and the application of knowledge and techniques in solving contextualised problems.
Even where formal tests were used, they were open-book, so all assessment methods used could assess organisational skills and the application of understanding and techniques rather than rote-learning and memory recall. Of the problems highlighted above with progression to Higher National or university study, the most significant for technician engineers’ progression to employment is the unlimited resubmission allowed, which is not representative of industrial practice. This aspect of the assessment practice can make students, even very capable ones, complacent in their approach to, and performance of, set tasks, something not tolerated in the ‘right-first-time’ ethos of modern-day industry. Also, for students requiring multiple opportunities and incremental improvement, achievement does not guarantee performance commensurate with the criteria achieved, so employers cannot be sure of the capabilities of students despite what appear to be transparent learning outcomes stating students’ unit-related achievements.

The conclusions of this study

Despite the transparent and rigorous external requirements and constraints placed on lecturers through Awarding Bodies’ use of criterion-referenced assessment and verification processes to set national standards, the lecturers in this study had significant responsibility and control over the micro-level of assessment practice. It was their integrity, within the context of a local community of practice, through which meaningful judgements about standards were made (Torrance et al., 2005, p. 3). Awarding Body verification procedures operate at an aesthetic level of validity, confirming surface and content validity of the samples reviewed, but the use of incremental improvement and an implicit emphasis on instrumental learning can reduce the construct validity of assessment, that is, confidence that the assessment measures the knowledge, skill, or ability intended.
This study has also found lecturers are significantly focussed on students achieving a pass, and so on achievement of the respective unit assessment criteria. Lecturers conflate summative and formative assessment at the micro-level of classroom practice to maintain students’ participation in the programme and allow more to succeed, but this can detract from the depth of student learning. A more holistic finding from this study suggests the BTEC assessment regime of the 2000s, which is criterion-referenced, uses teacher-based broken-up summative practices, and employs a variety of authentic assessment methods underpinned by formative assessment practices offering multiple referral opportunities, generates similar concerns to the Joint Committee National courses of the 1960s, which employed end-of-year, externally set, grouped examinations allowing little or no referral. Lecturers in this study still found assessment a ‘confining influence on teaching’, and the assessment did not necessarily indicate a student’s ‘real achievement’. There is therefore continuity as well as change in the issues and concerns underlying BTEC assessment practices over time, or as Torrance et al. (2005) have observed:

… the more important general finding here is that no approach to or method of assessment is immune from distortion when too many consequences ride on the results. (Torrance et al., 2005, p. 82)

Considerations and recommendations

For BTEC policy:

• A realisation that awarding body quality control procedures assess the face and content validities of assessment, and that some students’ level of learning, understanding and competence may fall short of these surface validities despite their apparent achievements. This offers a reason for the variability of standards between learning sites found to occur on progression to university.
• An acceptance that, as found by James and Biesta (2007), learning sites have their own learning cultures and, within that, their own assessment cultures based on social practices. It is the community of practice and the integrity of the lecturers that set and maintain standards, against a backdrop of external and internal influences.

• Centrally set policy should support and develop local assessor judgements through increased use of exemplar material and possibly a centrally set data bank of assessments allowing students a choice of assessment methods. This research also suggests that such reference material could encompass use of formative assessment prior to summative assessment, as stated within Edexcel’s assessment guidance (Edexcel, 2006). Such examples could include: written and verbal feedback given to students in referral situations to improve their performance; how feedback from summative closed-book or open-book tests can be effectively re-used to aid learning; in-class use of formative feedback whilst students are working on assignments; and appropriate use of coaching.

• Edexcel should offer staff development sessions for centres on the use of formative assessment specifically related to BTEC Nationals, to illustrate its integration into classroom practice and its relationship to summative assessment, and to explain and reinforce the use of exemplar material.

• Edexcel should encourage local centres to develop links through which practice related to formative assessment activities can be shared, and from which best practice may evolve that provides for greater consistency of approach and coherency in standards across centres.

• A consideration for Edexcel is the use of their competency-orientated approach and the requirement for all assessment criteria to be achieved to pass a unit. This system contrasts with the numerically assessed compulsory and HE sectors, where percentage marking is used within criterion-referenced assessment. Given the incremental-improvement and instrumental-learning attributes of BTEC assessment practice, does current policy enhance students’ learning over traditional numerical marking and a 50% mark required to achieve a pass?

For Programme Area practice:

• A departmental policy relating to both summative and formative assessment practice should be implemented which endeavours to produce a common, across-unit approach to referral.

• The setting-up of an academic board to share judgements, and to review and approve repeated referrals.

• Increase lecturers’ understanding of formative assessment and how it can be pragmatically implemented in classroom practice, and develop feedback that encourages reflection and deep learning.

• Homework should be used with formative intent, providing feedback about students’ progress and development to both lecturers and students.

• Explore other approaches to assessment, such as peer and self-assessment, to encourage students to develop a sustainable approach to learning and to prepare them for the various progression paths from the course.

• At the time of this study, no closed-book tests were used by any lecturer within their assessment practice. Limited use of closed-book testing should be considered to encourage students to prepare for assessments and so develop study skills, such as revision techniques, they may need for progression to higher-level study. This could also provide one assessment per unit proving the authenticity of students’ submissions.

• Increased use of tutorials focussing on supporting students, particularly within their first year of National study. These tutorials should emphasise the BTEC ethos and so aid students’ awareness and understanding of the requirements and demands of the various assessment methods used, and how they should best approach their assessment work to ensure they meet stipulated deadlines.
Award Date Mar 1, 2012

Files

EdD Thesis - 'BTEC Assessment Practice' by Alan Carter - March 2012 (UWE).pdf (10.2 Mb)
PDF


Executive Summary - 'A Study of BTEC National Assessment Practice' - March 2012 (A Carter) [4].pdf (456 Kb)
PDF



