Summative and Formative Assessment, Validity and Reliability

FORMATIVE ASSESSMENT

Monitors specific student achievement during the learning process

Principles

Practicality

Is practical when its demands on time, cost, and administration are reasonable

Reliability

Must be consistent and reliable

Validity

Must support the inferences made about test results

Authenticity

Close relationship between the tasks represented in the test and real-world tasks

Techniques and strategies

Product-assessments

Deliver the final product

Performance-assessments

Develops understanding in a practical way

Process-focused assessments

Establishes concrete steps

Impact on student learning

Increased attention to students

Increased confidence

Can reveal non-conformities or weak points

VALIDITY

Validity is approached through several types of evidence

Content-related evidence

Samples the subject matter being assessed

Requires the examinee to perform the behavior being measured

Criterion-related evidence

Assesses a skill against an external criterion

Oral fluency development

Vocabulary

Language flexibility

Grammar management

Construct-related evidence

Underlies the test's particular design and construction

Background of knowledge in a specific area

Consequential validity

Addresses key issues

Can be distinguished from other notions of validity

Underlines the resulting consequences of test use

High-stakes evaluations

SUMMATIVE ASSESSMENT

Grades student performance on a numerical basis

Principles

Practicality

Practical when the amount of resources required is adequate and accessible

Reliability

Must be consistent and reliable

Validity

Must comply with the inferences made about the test results

Authenticity

Must have applicability and consistency in the test task

Techniques and strategies

Product-assessments

The final product resembles objects in the real world

Performance-assessments

Monitoring or observation of knowledge applied in a real scenario

Process-focused assessments

Steps used to achieve particular results

Impact on student learning

Benefits

in the form of professional qualifications

Functioning

of the criteria can ground students' learning as they are assessed

Motivate

students to keep trying

Construction

of their own learning environment and development of metacognition

RELIABILITY

Consistency of the instrument

Student-related reliability

Involves test-wiseness or strategies gained from taking exams

Physical and psychological factors in the students can affect their performance

Rater reliability

Consistency and trustworthiness of the evaluators who score the test

Human error or subjectivity can affect the scoring process

Inter-rater reliability

Problems related to the raters' lack of experience may occur

Drawbacks such as lack of attention to the scoring criteria

Intra-rater reliability

A single evaluator is in charge of scoring the same data

Test administration reliability

Conditions of the environment in which the test is administered can affect scores

Test reliability

Reliability is not a constant property of the test itself