
Topic 0020: Assessment vs Evaluation

What is the difference between “assessment” and “evaluation”?

Assessment is the process of objectively understanding the state or condition of a thing, by observation and measurement. Assessment of teaching means taking a measure of its effectiveness. “Formative” assessment is measurement for the purpose of improving teaching; “summative” assessment is what we normally call “evaluation.”

Evaluation is the process of observing and measuring a thing for the purpose of judging it and of determining its “value,” either by comparison to similar things, or to a standard. Evaluation of teaching means passing judgment on it as part of an administrative process.

Ideally, a fair and comprehensive plan to evaluate teaching would incorporate many data points drawn from a broad array of teaching dimensions. Such a plan would include not only student surveys, but also self-assessments, documentation of instructional planning and design, evidence of scholarly activity to improve teaching, and most importantly, evidence of student learning outcomes.

Reference: http://www.itlal.org/?q=node/93




TOPIC 0019: Writing Your Literature Review

What is the literature review?

  1. A literature review summarises, critically analyses and evaluates previous research available on the subject, presenting this in an organised way. It should address a clearly articulated question or series of questions.
  2. It is NOT:
    • A descriptive list of, or a set of summaries of, books, articles, etc.
    • An exhaustive bibliography of everything ever written on the topic; you need to decide what to include
    • Your own arguments and ideas (like an essay)

Why do we write a literature review?

  • Demonstrate an in-depth understanding of your topic area including key concepts, terminology, theories and definitions
  • Identify who the major thinkers are
  • Identify what research has been done in that area
  • Find gaps in the research or current areas of interest to help you formulate your own research question
  • Identify the main research methodologies in your subject area
  • Identify main areas of agreement or controversy
  • Convince the reader that your research questions are significant, important and interesting
  • Convince the reader that your thesis will make an original contribution to the area being investigated.

Steps to complete the literature review

  1. Find relevant literature on your topic and follow trails of references
  2. Identify themes/ideas/theories/approaches to the topic that have emerged from your reading
  3. Organise ideas by theme, theory, approach, chronology, or any other appropriate structure, but do not just list different authors’ viewpoints
  4. Introduce and explain each theme (or theory/approach), present evidence from the readings (agreements/disagreements), comment critically, and relate it to your own research

Critical Questioning

  1. Who is the author?
  2. What is the author’s central point or main argument?
  3. What findings and conclusions are made?
  4. What evidence is used to support the conclusions?
  5. Is the evidence relevant? What methodology has the author used? What are the strengths and limitations?
  6. Does the author make any assumptions?
  7. What is not being said?
  8. Is there any explicit or hidden bias?
  9. How is the text relevant to YOUR project or assignment?
  10. How does this link with other texts that you have read?

(For synthesising information, refer to Topic 0007: Matrix Method for Literature Review – Approaches to Identify Research Gaps and Generate RQ)

Figure 1 Structuring Literature Review

Topic (broad to narrow)

Research Title: The Design and Development of an E-Portfolio for HIEs in Social Sciences and Humanities

  • 2.1 Chapter Overview
  • 2.2 E-Learning in Malaysia
  • 2.3 E-Portfolio in HIEs
  • 2.4 E-Portfolio Definition and Purpose
  • 2.5 E-Portfolio Reflective Learning Strategies
    • 2.5.1 Critical Thinking
    • 2.5.2 Problem-Solving
    • 2.5.3 Analytical Skills
  • 2.6 Conclusion and Gaps for Further Study

Critical Writing in a Literature Review

  1. Comparing and contrasting different theories, concepts etc and indicating the position you are taking for your own work
  2. Showing how limitations in others’ work create a research gap for you.
  3. Strategic and selective referencing to support the underpinning arguments which form the basis of your research
  4. Synthesising and reformulating arguments from various sources to create new/more developed point of view
  5. Agreeing with/defending a point of view or finding
  6. Accepting current viewpoints have some strengths but qualifying your position by highlighting weaknesses
  7. Rejecting a point of view with reasons (e.g. Lack of evidence)
  8. Making connections between sources

Adapted from Ridley, D. (2008). The literature review: A step-by-step guide for students. London: Sage.

Topic 0018: Design and Development Methods and Strategies

Design and Development Methods & Strategies

Design and development research uses a wide variety of methodologies. Evaluation research techniques (both quantitative and qualitative) are also included in many studies. The following summarises the methods commonly used in design and development research:

Common Methods Employed in Design and Development Research

Product & Tool Research
  • Comprehensive design & development projects: case study, content analysis, evaluation, field observation, in-depth interview
  • Phases of design & development: case study, content analysis, expert review, field observation, in-depth interview, survey
  • Tool development & use: evaluation, expert review, in-depth interview, survey

Model Research
  • Model development: case study, Delphi, in-depth interview, literature review, survey, think-aloud methods
  • Model validation: experimental, expert review, in-depth interview
  • Model use: case study, content analysis, field observation, in-depth interview, survey, think-aloud methods


The term ‘mixed methods research’ has been used to describe studies that combine qualitative and quantitative methods, using multiple approaches to answer given research questions.

Ross and Morrison (2004) support this trend when they take the position that qualitative and quantitative approaches are more useful when used together than when either is used alone, and that when combined they are likely to yield a richer and more valid understanding.

Johnson and Onwuegbuzie (2004) describe the areas of agreement between advocates of the two positions:

  • While individual perceptions are affected by one’s background and experience, there are not different realities, but simply varying perceptions.
  • It is possible for one data set to be compatible with different theories, i.e., there are alternative explanations of phenomena.
  • Empirical research does not always provide final proof.
  • Researchers are bound by their values, attitudes and beliefs.

This fundamental agreement makes mixed methods research not only practical in many situations, but also logically sound. However, many design and development studies that employ multiple research methods are not actually mixing the qualitative and quantitative orientations; they are simply making use of a variety of similar strategies.

Topic 0017: Internet-Mediated Research (IMR)



State of the Art

It is at the data-gathering stage that IMR methods have had their biggest impact, as opposed to other stages and processes in the methodology of a piece of research, such as conceptualisation and interpretation (e.g. Hewson, 2008).

IMR approaches which seemed viable at this time included interviews, focus groups and observational studies which used linguistic data, and procedures for implementing these methods were devised and piloted (e.g. interviews: Chen & Hinton, 1999; Murray & Sixsmith, 1998; focus groups: Gaiser, 1997; Tse, 1999; Ward, 1999; linguistic observation: Bordia, 1996; Workman, 1992; Ward, 1999).

  • most relevant to qualitative social science research
  • cost and time efficiency
  • expanded geographical reach, and
  • access to hard-to-reach populations

Tools, technologies, procedures

Synchronous approaches gather data in real time and can support:

  • debate, clarification and resolution
  • more rigorous procedures
  • obtaining signed consent forms

Asynchronous approaches can also be followed in ‘real time’, e.g. by subscribing to a group and following an ongoing discussion.

Design issues and strategies

  • Solely text-based methods
  • Lack of physical proximity
    • facial expressions,
    • tone of voice,
    • body language, etc.

Depth and reflexivity

  • offering potentially more time for thoughtful, reflective responses
  • elicitation of richer, more elaborate qualitative data
  • enhanced reflexivity and greater depth and accuracy in interviewees’ responses (Bowker & Tuffin, 2004; Kenny, 2005; McDermott & Roen, 2012; Murray & Sixsmith, 1998; O’Connor & Madge, 2002)

Levels of rapport

‘Getting to know’ participants may help maximise confidence in the authenticity of the accounts they offer during the course of an interview.

Anonymity, disclosure, social desirability

  • high degree of privacy (Hewson et al., 2003)
  • higher levels of self-disclosure than in face-to-face contexts (e.g. Bargh, McKenna & Fitzsimons, 2002; Joinson, 2001)

Principles for good practice in qualitative internet-mediated interview and focus group research.

  1. Use robust, well-tested procedures which have been well-piloted (reliability).
  2. Use the simplest low-tech solutions and equipment available that will do the job (reliability, accessibility).
  3. Use appropriate procedures for verifying identity (e.g. offline, audio / video) where this is crucial (e.g. highly sensitive research) (ethics, validity).
  4. Adopt clear strategies for establishing rapport; this has been shown to work well and is advisable (validity).
  5. Remain mindful of potential trade-offs when deciding upon procedures and making design choices, e.g. asynchronous approaches may facilitate depth and reflexivity but reduce conversational ‘flow’ (validity).
  6. Related to the above principle, remain aware of possibilities for combining methods, e.g. online and offline, asynchronous and synchronous, etc. (validity).
  7. Carefully consider security and confidentiality risks when making sampling and procedural design choices, and the ethical responsibility to inform potential participants of any non-trivial risks they may be unaware of (ethics, validity).
  8. Adopt procedures for maximising security and confidentiality where possible (e.g. setting up a dedicated online research site) (ethics, validity).
  9. Remain mindful of the possible threats to participant confidentiality and anonymity that can emerge from dissemination and publication procedures, and take careful measures to minimise these threats (ethics).
  10. Respect standards of privacy and netiquette, and pass participation requests through moderators where appropriate (e.g. sampling from mailing lists or newsgroups) (ethics).
  11. Make sure participation requests are well-constructed, containing information on researcher affiliations, contact details for further information, and value of the research (ethics, validity).
  12. Carefully consider how different sampling approaches and design procedures may facilitate or restrict access by different groups (accessibility, ethics).

Principles for good practice in qualitative internet-mediated observation and document analysis research.

  1. Keep in mind that different observation sites / sources may restrict or facilitate the design options available (e.g. using archived logs precludes participant approaches and makes disclosure / consent often implausible; observing real-time chat makes undisclosed, nonparticipant observation often untenable) (ethics, validity).
  2. Keep in mind the different types of dialogue and interaction that may be encouraged by synchronous (e.g playful) and asynchronous (e.g. reflective) technologies when selecting which is most appropriate (validity).
  3. Carefully consider whether undisclosed approaches are ethically justifiable, keeping in mind the following key factors: privacy expectations; sensitivity of data; levels of risk of harm; legal and copyright regulations; scientific value (ethics).
  4. Keep in mind that trade-offs will often emerge, especially in relation to ethics procedures, e.g. disclosure may increase the risk of reducing data authenticity and validity, but also reduce the risk of harming a group (ethics, validity).
  5. Keep in mind that it is often good practice to consult moderators before carrying out observation (e.g. of online discussion groups), particularly where disclosed approaches are proposed; however, moderators may also have agendas and opinions which could be prohibitive to the research, and in some cases not making contact is arguably justified (ethics, validity).
  6. Remain mindful of the increased traceability of data sources online, and the potential associated threats to anonymity / confidentiality, particularly in devising dissemination / publication strategies (ethics).
  7. Take steps to maximise data security, especially when utilising less secure technologies such as email (e.g. in soliciting documents) (ethics).

Topic 0016: Differentiating Quantitative vs Qualitative Research

The differences between quantitative and qualitative research are summarised below:





Focus

  • Quantitative: concerned with discovering facts about social phenomena.
  • Qualitative: concerned with understanding human behaviour from the informants’ perspective.

Assumptions

  • Quantitative: assumes a fixed and measurable reality.
  • Qualitative: assumes a dynamic and negotiated reality.


Purpose

  • Quantitative:
    • To test theories
    • To construct facts
    • To differentiate
    • To correlate
    • To predict human behaviour
    • To explain statistically
  • Qualitative:
    • To complete theories
    • To enhance knowledge
    • To explain facts
    • To explain phenomena

Research Methods

  • Quantitative:
    • Experimental
    • Quasi-experimental
    • Structured interview
    • Structured observation
    • Survey
    • Correlational
  • Qualitative:
    • Observation
    • Unstructured interview
    • Document analysis
    • Case study
    • Ethnography
    • Narrative
    • Grounded theory


Data Collection

  • Quantitative: formal (structured) interviews, surveys, questionnaires
  • Qualitative: informal (unstructured) interviews, observation, documentation, audio-visual materials


Data Analysis

  • Quantitative:
    • Raw data in numerical form
    • Statistical analysis
    • Deductive
    • Software such as SPSS, SEM-AMOS, SmartPLS, etc.
  • Qualitative:
    • Raw data as verbatim transcripts
    • Continuous analysis
    • Non-statistical
    • Inductive
    • Software such as ATLAS.ti, NVivo, etc.
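The deductive/inductive contrast can be illustrated with a toy sketch in Python (the scores, excerpts and codes below are invented purely for illustration; only the standard library is used): a quantitative analysis reduces numeric raw data to statistics, while a qualitative analysis tallies codes attached to verbatim text so that themes emerge inductively.

```python
from collections import Counter
from statistics import mean, stdev

# Quantitative: numeric raw data analysed statistically (deductive).
scores = [72, 85, 90, 66, 78]
print(f"mean={mean(scores):.1f}, sd={stdev(scores):.1f}")  # mean=78.2

# Qualitative: verbatim excerpts tagged with codes, which are then
# tallied so that themes emerge inductively from the data.
coded_excerpts = [
    ("I felt more confident after the e-portfolio task", "confidence"),
    ("Reflecting on my work helped me see my mistakes", "reflection"),
    ("Writing reflections took a lot of time", "workload"),
    ("Looking back at earlier drafts showed my progress", "reflection"),
]
themes = Counter(code for _, code in coded_excerpts)
print(themes.most_common())  # 'reflection' is the dominant theme
```

In practice the statistical side would be handled by packages such as SPSS or SmartPLS, and the coding side by ATLAS.ti or NVivo; the sketch only shows the difference in reasoning direction.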

Data Interpretation

  • Quantitative: generalization
  • Qualitative: summarization


Concerns

  • Quantitative: variables, validity
  • Qualitative: time-consuming, reliability, no standardized procedure, not generalizable to a population


Key Concepts

  • Quantitative: explain; variables; hypothesis; conceptual framework; generalization
  • Qualitative: explore; elements; categories; themes; patterns; models; frameworks

Sampling Method

Probability Sampling (Creswell, 2005; Krejcie and Morgan, 1970)

Refer to Topic 0013: Choosing a Sampling Method

Non-Probability Sampling (Miles and Huberman, 1994; Creswell, 2005)
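For probability sampling, Krejcie and Morgan’s (1970) formula estimates the sample size required for a given finite population. A minimal Python sketch (the function name and defaults are my own; 3.841 is the table value of chi-square for 1 degree of freedom at the .05 confidence level):

```python
import math

def krejcie_morgan(N, chi2=3.841, P=0.5, d=0.05):
    """Required sample size for a finite population of size N.

    chi2: chi-square value for 1 degree of freedom at the .05 level
    P:    assumed population proportion (0.5 maximises the sample size)
    d:    acceptable margin of error
    """
    s = (chi2 * N * P * (1 - P)) / (d ** 2 * (N - 1) + chi2 * P * (1 - P))
    return math.ceil(s)

print(krejcie_morgan(100))   # 80
print(krejcie_morgan(1000))  # 278
```

Both values match the published Krejcie and Morgan table (80 for a population of 100, 278 for 1,000).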

Topic 0015: Needs Assessment

Definition of Key Terms

“Need” refers to the gap or discrepancy between a present state (what is) and a desired state (what should be). The need is neither the present nor the future state; it is the gap between them.

“Target Group” Needs assessments are focused on particular target groups in a system. Common target groups in education settings include students, parents, teachers, administrators, and the community at large. Ideally, needs assessments are initially conducted to determine the needs of the people whom the system serves (the service receivers). However, a “comprehensive” needs assessment often takes into account needs identified in other parts of a system. For example, a needs assessment might include the concerns of the “service providers” (e.g. teachers, guidance counselors, or school principals—the people who have a direct relationship with the service receivers) or “system issues” (e.g., availability of programs, services, and personnel; level of program coordination; and access to appropriate facilities).

A “Needs Assessment” is a systematic approach that progresses through a defined series of phases. Needs Assessment focuses on the ends (i.e., outcomes) to be attained, rather than the means (i.e., process). For example, reading achievement is an outcome whereas reading instruction is a means toward that end. It gathers data by means of established procedures and methods designed for specific purposes. The kinds and scope of methods are selected to fit the purposes and context of the needs assessment. Needs assessment sets priorities and determines criteria for solutions so that planners and managers can make sound decisions. Needs assessment sets criteria for determining how best to allocate available money, people, facilities, and other resources. Needs assessment leads to action that will improve programs, services, organizational structure and operations, or a combination of these elements.


Let’s take a quick look at general steps taken in a needs assessment.

  1. Exploration and identification. During the first phase of the needs assessment, you need to determine what you already know about your organization’s needs, whether it be additional resources, new technologies, or market expansion. It’s about figuring out where you are and where you want to be. You also need to discover other undisclosed needs that may be hindering you from moving from where you are to where you want to be. You will often rank these needs in order of importance. You will then set the scope of your research. In other words, the needs you are going to focus upon.
  2. Data gathering and analysis. At this stage you are collecting the information you need to better understand the gaps (needs) between where you are and where you want to be. Data may be collected from internal or external records, or through research techniques such as surveys and document analysis. After the data is collected, it is organized and analyzed.
  3. Utilization. This is where the data you analyzed is used to create a plan of action and implement it. You will set priorities, evaluate solutions, apply a cost-benefit analysis to determine which solution is best in light of the relative costs and benefits of each, formulate a plan to implement your solution, and then allocate the resources necessary for implementation. Again, the goal is to develop a plan to close the gaps between the desired future state and its current state.
  4. Evaluation. You will evaluate the results of the action plan against your goals: has the action plan placed you closer to where you want to be? Evaluation can help you determine what made an action plan successful or find the errors in your needs assessment. For example, did you miss an important gap, or were the resources you allocated insufficient to close the gap?

What Does Watkins Say?

Watkins describes analysis as “a process for breaking something down to understand its component parts and their relationships” (R. Watkins, personal communication, June 13, 2016). He concludes, “A needs assessment identifies gaps between current and desired results and places those [gaps] in priority order on the basis of the costs to ignore the needs.” He also stresses that the gap itself is the need.
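Watkins’ definition (gaps between current and desired results, placed in priority order by the cost of ignoring them) can be sketched in a few lines of Python; the outcomes and figures below are invented purely for illustration:

```python
# Hypothetical needs-assessment data: each record holds a current and a
# desired result, plus an estimated cost of ignoring the gap.
results = [
    {"outcome": "reading achievement", "current": 62, "desired": 85, "cost_to_ignore": 120_000},
    {"outcome": "graduation rate", "current": 78, "desired": 90, "cost_to_ignore": 200_000},
    {"outcome": "attendance", "current": 88, "desired": 95, "cost_to_ignore": 40_000},
]

# The need is the gap itself, not the current or the desired state.
for r in results:
    r["gap"] = r["desired"] - r["current"]

# Place the needs in priority order on the basis of the cost to ignore them.
priorities = sorted(results, key=lambda r: r["cost_to_ignore"], reverse=True)
for r in priorities:
    print(f'{r["outcome"]}: gap={r["gap"]}, cost to ignore={r["cost_to_ignore"]:,}')
```

Note that prioritising by cost to ignore, rather than by gap size alone, can reorder the needs: the largest gap is not necessarily the most urgent one.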


Figure 1: Needs Assessment vs. Needs Analysis Concept Map V2.0


Sources (adapted from):

  1. http://study.com/academy/lesson/what-is-needs-assessment-definition-examples-quiz.html
  2. https://www2.ed.gov/admins/lead/account/compneedsassessment.pdf
  3. http://www.ispi.org/ISPI/Resources/PX/Articles/Editors__Pick/Needs_Assessment_vs_Needs_Analysis__What_s_the_Diff_.aspx