

SLIDE 1

Acknowledgment

v At the start of any new venture, it is my pleasure to take a moment to give thanks. I'd like first to thank Almighty God for granting me good health, ability and opportunity to accomplish my goals. I thank my adviser Dr. Javed Khan for challenging me to push myself harder to fulfill expectations, and for supporting me in my endeavors.
v Along with my advisor, I am also grateful for having an exceptional doctoral committee in Dr. Arvind Bansal, Dr. Austin Melton, and Dr. Omar De La Cruz Cabrera.
v I am also grateful to Professor Dmitry Ryabogin for serving on my committee, and to Professor Mahbobeh Vezvaei for her thorough editing, both conceptual and mathematical, of my dissertation.
v I would also add a special thanks to Hessah Alqahtani for standing by me.
v My special thanks to the staff of the Department of Computer Science, especially Janet Kotila, Marcy Curtiss, Nathan Thomas and Brenda Boykin, for their smiles and kind words.
v I would also like to take a moment to appreciate the support of my parents and my family, especially my grandmother Aziza Al Kateeb and my lovely aunts Ghada Al Himali and Ebtisam Al Himali, and to express my gratitude to my soul mother Shadia Al Himali. I also thank the Ministry of Higher Education of Saudi Arabia for their financial support of my graduate study.
v Last, I extend many thanks to my colleagues and friends, especially Maha Allouzi, Amal Babour and Amani Alnahdi. I add my gratitude to the other unnamed people who generously supported me with prayer and good wishes.

Rania Anwar Aboalela, May 11, 2017, Kent, Ohio

1

SLIDE 2

An Assessment of Knowledge by Pedagogical Computation on Cognitive Level Mapped Concept Graphs

Rania Aboalela, PhD Defense, Department of Computer Science, Kent State University

SLIDE 3

The Outline

Ø Introduction
Ø Problem Definition
Ø The Proposed Model and the Theory TCS2
Ø The Experiment to Validate the Concept States of the Theory of Cognitive Skill in Concept Space (TCS2)
Ø The Probabilistic Methods for Estimating the Learning States of the Concepts in Each Zone
Ø Contribution
Ø List of Publications

3

SLIDE 4

Introduction

1. The Bloom Taxonomy (BT) & the Revised Bloom Taxonomy (RBT)
  • The original was created in 1956 by Dr. Benjamin Bloom (Bloom, Engelhart, Furst, Hill, & Krathwohl, 1956).
  • It arranges what the learner has to learn into a hierarchy of six levels.
  • In the revision (Anderson, et al., 2001), the six major categories were changed from noun to verb forms and renamed.

4

SLIDE 5

Introduction

The Original Bloom's Nouns vs. the Verbs in Anderson's Version:

Level | The Original Noun Taxonomy | The New Verb Taxonomy
6     | Evaluation                 | Create
5     | Synthesis                  | Evaluate
4     | Analysis                   | Analyze
3     | Application                | Apply
2     | Comprehension              | Understand
1     | Knowledge                  | Recall

5

SLIDE 6

Introduction

2. Assessing the Knowledge in One Domain
Ø The Knowledge Assessment Theory (KAT)
§ KAT was introduced by (Falmagne, Cosyn, Doignon, & Thiery, 2003).
§ It does not identify the cognitive-difficulty variations of learning.
§ It is only applicable to a domain with clear relations, such as mathematical fields.
Ø Intelligent Design (Khan, Hardas, & Ma, 2005)
  • They introduced the Topic Dependency Graph (TDG).
  • They identify the ontological relations between the concepts in the course and the test questions.
  • Intelligent design and evaluation of courses, learning materials and testing.

6

SLIDE 7

Introduction (cont…)

3. Combining the Revised Bloom Taxonomy with Knowledge Space
Ø (Khan, Nafa 2015, 2016, 2017; Aboalela, 2015)
  • Nafa, Khan and their colleagues (2015, 2016, 2017) automatically discover and extract Bloom's Taxonomy from the text in one knowledge space.
  • (Khan, Aboalela, 2015) visualize the concept space of course content in three dimensions: the syllabus, ontological and cognitive dimensions.

7

SLIDE 8

Problem Definition

Ø The main problem in assessing the knowledge in one domain is to find the knowledge state of the assessed individual in a short time and with a minimum number of questions.
Ø The knowledge structure for assessing the knowledge in one domain is complicated and requires a large space.
Ø Can a technique and model help us to objectively (algorithmically) understand, analyze and assess the concept states and skill levels of the learner with respect to the conceptual contents, concepts and their relationships that define a specific knowledge domain?

8

SLIDE 9

The Proposed Model and the Theory

We provide:
1. A Framework for Assessing the Knowledge in One Domain
2. A Mapped Concept Graphs model, and methods to assess and analyze the Concept States of the learner in one domain.

9

SLIDE 10

The Framework of the Assessment of Knowledge in One Domain

Ø The assessment input:
  • A set of questions mapped to concepts in the Concept Space
  • The Cognitive Level Mapped Concept Graph (CLMCG)
  • The set of learner responses to the set of questions
Ø The goal is to find:
  • The zones of the concepts (which concepts are in which zone)
  • The amount of knowledge (the learning states) of these concepts in the various zones (how well each concept is known)

10

SLIDE 11

The Proposed Model

We provide:
1. A Framework for Assessing the Knowledge in One Domain
2. A Mapped Concept Graphs model, and methods to assess and analyze the Concept States of the learner in one domain.

11

SLIDE 12

The Proposed Model and Methods

The mapped concept graphs model components:
1. The Cognitive Level Mapped Concept Graph (CLMCG)
2. The Concept Mapped Testing and Evaluation Method
3. The Concept States Theory of Cognitive Skills in Concept Space (TCS2)
4. The Assessment Analytics

12

SLIDE 13

The Proposed Model and the Methods

The mapped concept graphs model components:
1. The Cognitive Level Mapped Concept Graph (CLMCG)
2. The Concept Mapped Testing and Evaluation Method
3. The Concept States Theory of Cognitive Skills in Concept Space (TCS2)
4. The Assessment Analytics

13

SLIDE 14

1. The Cognitive Level Mapped Concept Graph (CLMCG)

1. The syllabus dimension [Area Knowledge Space]
2. The ontological dimension
3. The cognitive dimension

14

SLIDE 15

1. The Cognitive Level Mapped Concept Graph (CLMCG)

1. The syllabus dimension [Area Knowledge Space]
2. The ontological dimension
3. The cognitive dimension

The syllabus relation retains the occurrence of the concepts in a formal textbook, i.e., the occurrence of the concepts in the chapter, section, sub-section, etc.

15

SLIDE 16

The Syllabus Relation

The root holds the concepts in the course name or the book name. The internal nodes hold the concepts in the titles (chapter titles, section titles). The leaf nodes hold the fine concepts. Explicit concepts, implicit concepts, verb nodes, and verb & noun nodes are connected by "occurs in" links.

[Figure: syllabus-relation graph rooted at "The data structures with C using STL" (0.0.0.0.0.0.n.p), with concepts such as Sorting Algorithm, Selection Sort Algorithm, Simple Search Algorithm, Sequence of Items, Elements, and Items, each labeled with a syllabus ID such as 3.1.1.0.1.9.n.s.]

16
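The syllabus IDs attached to each node (e.g. 3.1.1.0.1.9.n.s) encode where a concept occurs in the book. A minimal parsing sketch follows; the slides do not define the fields, so treating them as six hierarchical position numbers (chapter, section, sub-section, …) followed by a part-of-speech tag (n/v) and a final tag is an assumption, as are the function and field names.

```python
# Hypothetical decomposition of a syllabus ID such as "3.1.1.0.1.9.n.s".
# Field meanings are assumed, not taken from the dissertation.
def parse_syllabus_id(sid: str):
    parts = sid.split(".")
    positions, pos_tag, kind = parts[:-2], parts[-2], parts[-1]
    return {
        "positions": [int(p) for p in positions],  # chapter, section, ...
        "pos": {"n": "noun", "v": "verb"}[pos_tag],  # part-of-speech tag
        "tag": kind,                                 # trailing tag (s/p)
    }

parsed = parse_syllabus_id("3.1.1.0.1.9.n.s")
print(parsed["positions"][0])  # chapter number: 3
```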

SLIDE 17

1. The Cognitive Level Mapped Concept Graph (CLMCG)

1. The syllabus dimension [Area Knowledge Space]
2. The ontological dimension
3. The cognitive dimension

The ontological relation links the concepts in terms of class-of, part-of, and instance-of relationships. Example:
  • Sorting Algorithm is a sub-class of Algorithm
  • Selection Sort Algorithm is an instance of Sorting Algorithm
  • The base of the Selection Sort Algorithm is a sub-part of the Selection Sort Algorithm

[Figure: an ontological link "On" from concept A to concept B.]

17

SLIDE 18

1. The Cognitive Level Mapped Concept Graph (CLMCG)

1. The syllabus dimension [Area Knowledge Space]
2. The ontological dimension
3. The cognitive dimension

§ The cognitive relation captures the relationship between concepts in terms of the prerequisite concept that must be attained in order to know the target concept at a certain skill level.

[Figure: a cognitive link Lk from concept A to concept B.]

18

SLIDE 19

Labeling the Concept with a Cognitive Skill Level

The Bloom relation is a directed link from the parent A to the child B, with the link property that to know the child at skill level Lk we need to know the parent. According to Theory 1, B cannot be answered correctly unless A is known.

To [Lk] B, one must know A (basic Theory 1). Ex:
Ø To Apply Sorting, the student must know the Sorting Algorithm.
Ø To Analyze the Sorting Algorithm, the student must know the Running Time of the algorithm.

19
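The prerequisite rule above ("to [Lk] B one must know A") can be sketched as a check over the Bloom links. Everything here (the edge-list representation, the function name) is illustrative, not code from the dissertation.

```python
# A sketch of the Bloom prerequisite rule: a link A --Lk--> B means B
# cannot be attained at level k unless A is known. Data shapes assumed.
def missing_prerequisites(edges, known, concept, level):
    """Return the prerequisites of `concept` at `level` not in `known`."""
    return [a for (a, b, k) in edges if b == concept and k == level and a not in known]

edges = [("Sorting Algorithm", "Sorting", 3),        # to Apply Sorting ...
         ("Running Time", "Sorting Algorithm", 4)]   # to Analyze Sorting Algorithm ...
print(missing_prerequisites(edges, {"Running Time"}, "Sorting Algorithm", 4))  # []
```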

SLIDE 20

The Proposed Model and Methods

The mapped concept graphs model components:
1. The Cognitive Level Mapped Concept Graph (CLMCG)
2. The Concept Mapped Testing and Evaluation Method
3. The Concept States Theory of Cognitive Skills in Concept Space (TCS2)
4. The Assessment Analytics

20

SLIDE 21

2. The Concept Mapped Testing & Evaluation Method

  • In order to measure student learning, we set up a concept mapped testing and evaluation method.
  • A test is composed of a set of questions.
  • Students are required to answer the questions based on their knowledge.
  • A grader evaluates the student's knowledge based on the answers.
  • In conventional evaluation, a grader grades the answers and assigns a quantitative score to the student.
  • We slightly modify the evaluation method: instead of assigning a numerical score, the grader is asked to evaluate whether there is evidence in the answer that the student has succeeded or failed to learn a concept at a certain cognitive skill level.
  • We call this the concept mapped testing & evaluation method.

21

SLIDE 22

2. The Concept Mapped Testing & Evaluation Method

Labeling the Tested Concept with the Question: to answer the question Qi correctly, the concept Cx must be known at skill level Lk.

[Figure: a question node Qi linked to a tested concept Cx by a link labeled Lk.]

22
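One way to picture the grader's record in this modified method is a small evidence tuple per (question, concept, skill level) rather than a single numeric score. The field names below are assumptions for illustration only.

```python
# A sketch of a concept-mapped evaluation record (field names assumed):
# the grader records, per (question, concept, level), whether the answer
# gives evidence that the skill was attained.
from dataclasses import dataclass

@dataclass
class Evidence:
    question: str   # e.g. "Q1"
    concept: str    # e.g. "Selection Sort Algorithm"
    level: int      # Bloom level k of the link Lk
    ok: bool        # grader judged the skill attained

record = Evidence("Q1", "Selection Sort Algorithm", 3, True)
print(record.concept, record.level, record.ok)
```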

SLIDE 23

The Proposed Model and Methods

The mapped concept graphs model components:
1. The Cognitive Level Mapped Concept Graph (CLMCG)
2. The Concept Mapped Testing and Evaluation Method
3. The Concept States Theory of Cognitive Skills in Concept Space (TCS2)
4. The Assessment Analytics

23

SLIDE 24

3. The Concept States Theory of Cognitive Skills in Concept Space (TCS2)

Ø The assessment theory TCS2 is a theory (a coherent group of tested general propositions that can be used for explanation and prediction of a class of phenomena) that helps us objectively (algorithmically) understand and assess the learning states and skill levels of a learner. The theory comprises the Concept States and the zones of the Concept States.
Ø The Concept Zones (CZ) of the learner:
  • The set of concepts in the domain for which the learner is able to answer the related questions.
  • The Concept Zones include the concepts combined with the skill level.
  • Each concept is denoted by an id number with a superscript giving the cognitive skill level.
  • Example of the Concept Zones: CZ = {C1(L2), C2(L3), C3(L6), C4(L2)}
  • L2 indicates skill level 2, which is the verb understand in Bloom's Taxonomy.
  • L3 indicates skill level 3, which is the verb apply in Bloom's Taxonomy.
  • L4 indicates skill level 4, which is the verb analyze in the Bloom's Taxonomy hierarchy.

24

SLIDE 25

3. The Concept States Theory of Cognitive Skills in Concept Space (TCS2)

Ø The zones of the Concept States:
1. The zones of Verified Skills (VS)
2. The zones of Derived Skills (DS)
3. The zones of Potential Skills (PS)

25

SLIDE 26

3. The Concept States Zones

Ø The zones of the Concept States:
1. The zones of Verified Skills (VS)
2. The zones of Derived Skills (DS)
3. The zones of Potential Skills (PS)
Ø After the assessed individual completes the assessment, the result falls into six binary concept states:
1. Verified Known Skills (VKS)
2. Derived Known Skills (DKS)
3. Potential Known Skills (PKS)
4. Verified Not known Skills (VNS)
5. Derived Not known Skills (DNS)
6. Potential Not known Skills (PNS)

26

SLIDE 27

3. The Concept States Theory of Cognitive Skills in Concept Space (TCS2)

The Verified Skills (VS) Zones

If (Qi, Cx)Lk and Cx is answered correctly ⟹ Cx ∈ Verified Known Skills at level k, VKS(k), ∀ Cx ∈ tested concepts.
If (Qi, Cx)Lk and Cx is answered incorrectly ⟹ Cx ∈ Verified Not known Skills at level k, VNS(k), ∀ Cx ∈ tested concepts.

Qi ∈ test questions. Cx ∈ tested concepts. Lk ∈ Bloom link of level k. VS(k): Verified Skills at level k.

[Figure: a question Qi linked at level Lk to a tested concept Cx, classified as Verified Known Skills (VKS) when answered correctly and Verified Not known Skills (VNS) otherwise.]

27
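The VS rule above can be sketched directly. The data shapes (triples of question, concept, level, plus a question → correct? map) are assumptions for illustration.

```python
# A sketch of the Verified Skills rule: a tested concept Cx linked to
# question Qi at level Lk goes to VKS(k) if the answer was correct,
# otherwise to VNS(k). Data shapes are assumed.
def verified_zones(mapping, answers):
    """mapping: (question, concept, level) triples; answers: question -> bool."""
    vks, vns = {}, {}
    for q, c, k in mapping:
        (vks if answers[q] else vns).setdefault(k, set()).add(c)
    return vks, vns

mapping = [("Q1", "C1", 6), ("Q2", "C2", 2), ("Q3", "C3", 3)]
vks, vns = verified_zones(mapping, {"Q1": True, "Q2": True, "Q3": False})
print(vks)  # {6: {'C1'}, 2: {'C2'}}
print(vns)  # {3: {'C3'}}
```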

SLIDE 28

3. The Concept States Theory of Cognitive Skills in Concept Space (TCS2)

The Derived Skill Zones: DS

Qi ∈ test questions. Cx ∈ tested concepts. Lk ∈ Bloom link of level k. VS(k): Verified Skills at level k.

  • The DS is the set of concepts in the prerequisite set, at a certain skill level, of the tested concepts; these concepts have never been directly tested.
  • For example, a Derived Skill (DS) at skill level 2 is defined where there is indirect evidence that the concept Ci is understood (or not understood) by the student; it then belongs to DS(k=2).
  • After the assessed individual completes the assessment, the result falls into two concept states: Derived Known Skills (DKS) and Derived Not Known Skills (DNS).

[Figure: a question Qi linked at level Lk to a tested concept Cx, whose prerequisite Ci becomes a derived skill at level k=2, shown for both the DKS and the DNS cases.]

28

SLIDE 29

3. The Concept States Theory of Cognitive Skills in Concept Space (TCS2)

Derived Known Skills Zones: DKS (k>2)

Is Cy in DS(k), where k>2? If Cy is known, i.e., it is in DKS(L2) or VKS(L2), and if all level-k support nodes of Cy, i.e., S(Cy, k), are in VKS(L2) ∨ DKS(L2), then Cy is considered a derived skill at level k. S(Cy, k) is the skill-level-k support set for Cy.
If Cy ∈ DKS(L2) ∨ VKS(L2) and S(Cy, k) ⊆ DKS(L2) ∨ VKS(L2) → Cy ∈ Lk Derived Known Skills.

[Figure: a question Qx verifying concept CC at L2; concepts CA, CB and Cx are derived known skills at level 2, and Cy is a derived known skill at skill level k.]

29
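The DKS(k>2) rule is a subset check over the support set. A minimal sketch, assuming the support sets are given as a (concept, level) → set map; the names are illustrative.

```python
# A sketch of the DKS(k>2) rule: Cy is a derived known skill at level k
# if Cy itself is known at level 2 (VKS or DKS) and every node of its
# level-k support set S(Cy, k) is known at level 2. Data shapes assumed.
def in_dks(cy, k, support, known_l2):
    """support: (concept, k) -> set of support concepts; known_l2 = VKS(L2) ∪ DKS(L2)."""
    return cy in known_l2 and support.get((cy, k), set()) <= known_l2

known_l2 = {"Cy", "CA", "CB", "CC"}
support = {("Cy", 4): {"CA", "CB"}}
print(in_dks("Cy", 4, support, known_l2))  # True
```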

SLIDE 30

3. The Concept States Theory of Cognitive Skills in Concept Space (TCS2)

Potential Known Skill Zones: PKS(k)

Let S(A, k) be the support set of A at level k. If every node in S(A, k) is in VKS ∨ DKS at any level (the level does not matter, because we only want to guarantee that the set is known), i.e., S(A, k) ⊂ VS() ∨ DKS(), but there is no evidence that A is known, then A is in the potential skill set PKS(k), i.e., A ∈ PKS(k).

Cd, Cx ∈ VS. CC, CA, CB ∈ DS. Lk ∈ Bloom link at level k. Lm ∈ Bloom link at level m.

[Figure: a question Qx verifying Cd and Cx; CA, CB and CC are derived known skills at level k ≥ 2; A is a potential known skill at level k.]

30
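The PKS(k) rule mirrors the DKS check, but with no direct evidence about A itself. A minimal sketch under the same assumed data shapes:

```python
# A sketch of the PKS(k) rule: A is a potential known skill at level k
# when every node of its (non-empty) support set S(A, k) is already in
# VKS ∨ DKS, but there is no direct evidence about A. Names assumed.
def in_pks(a, k, support, known, evidenced):
    """known = VKS ∪ DKS at any level; evidenced = concepts with direct evidence."""
    s = support.get((a, k), set())
    return a not in evidenced and bool(s) and s <= known

known = {"CA", "CB", "CC", "Cd", "Cx"}
print(in_pks("A", 3, {("A", 3): {"CA", "CB"}}, known, evidenced={"Cx"}))  # True
```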

SLIDE 31

A Simple Example of an Assessment Structure Based on the Assessment Analytics

[Figure: questions q1–q4 linked to concepts C1–C8 with Bloom links L2–L6. Legend: skill levels L2 estimated by the DS zones; skill levels Lk > 2 estimated by the DS zones; skill levels Lk estimated by the PS zones.]

31

SLIDE 32

4. The Assessment Analytics of the Theory TCS2

§ An Example of a Question
  • Show the order of elements in the [given] array after each pass of the Selection Sort Algorithm. int arr[6] = {5, 1, 8, 2, 7, 9} [write the final result in the array]
  • Grader's Evaluation (Grader 1): Selection Sort Algorithm [OK], The pass of selection sort algorithm [OK], Array [OK], The Order of the Element in the Array [OK], Order [OK], Sort [OK].
§ Conventionally, a grader evaluates the answers and assigns a quantitative score.

Ø Connect the questions to the cognitive level mapped concepts.
  • Cognitive Analysis:
  • To answer question #1 correctly, the concepts "Sort" and "Order" need to be understood (L2).
  • The concept "Array" needs to be applied (L3).
  • The "Order" of the element in "the Array" needs to be analyzed (L4).
  • "The pass" of the "selection sort algorithm" needs to be evaluated (L5) and applied (L3), and the concept "Selection Sort Algorithm" needs to be applied (L3).

32
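For reference, the passes the sample question asks about can be sketched as follows (this is an illustrative implementation, not code from the dissertation):

```python
# Selection sort on the question's array, recording the state after each
# pass (each pass swaps the smallest remaining element into place).
def selection_sort_passes(arr):
    a, passes = list(arr), []
    for i in range(len(a) - 1):
        m = min(range(i, len(a)), key=a.__getitem__)  # index of smallest element
        a[i], a[m] = a[m], a[i]
        passes.append(list(a))  # state after this pass
    return passes

for state in selection_sort_passes([5, 1, 8, 2, 7, 9]):
    print(state)
# first pass: [1, 5, 8, 2, 7, 9] ... final: [1, 2, 5, 7, 8, 9]
```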

SLIDE 33

The Assessment Analytics of a Perfect Answer

  • Order [OK], Array [OK], Sort [OK], Selection Sort Algorithm [OK], The passes of Selection Sort Algorithm [OK], The order of the element in the array [OK] → the student got all the concepts!

[Figure: the full Cognitive Level Mapped Concept Graph annotated with the answer Q1A, covering concepts such as Selection Sort Algorithm, The passes of selection sort algorithm, The order of the element in the array, Sort Process, Running Time, Analysis of algorithms, and the running times O(n2) and O(nlogn), each labeled with its syllabus ID. Legend: Question's Answer, Blind Concept, Verified Concept, Derived Concept, Potential Concept, Verified Link, Derived Link, Potential Link.]

33

SLIDE 34

Derived Skill 3 According to the Concept "Sort"

DS(3) = {"Smallest Element", "Largest Element"}

[Figure: the subgraph for question Q1, showing verified concepts such as Sort, Sort Process, Unsorted Order, Elements and Position, and the derived concepts "smallest element" and "largest element". Legend: Question's Answer, Verified Concept, Derived Concept, Tested-concept Link, Derived Link.]

34

SLIDE 35

Potential Skills

PS(3) = {"The index of the smallest element", "Sublist Array", "Simple search algorithm", "Radix Sort Algorithm", "Heap Sort Algorithm", "Insertion Sort"}

[Figure: the subgraph for the answer Q1A, showing the potential concepts (The index of the smallest element, Sublist Array, Simple Search Algorithm, Radix Sort Algorithm, Heap Sort Algorithm, Insertion Sort Algorithm) inferred from the known concepts.]

35

SLIDE 36

The Outline

Ø Introduction
Ø Problem Definition
Ø The Proposed Model and the Theory
Ø The Experiment to Validate the Concept States
Ø The Probabilistic Methods for Estimating the Learning States of Concepts in Each Zone
Ø Contribution
Ø Conclusion and Future Works

36

SLIDE 37

The Experiment to Validate the Binary Concept States Assessment Methods

1. The assessment setup
2. The validation of the binary state methods (the matching analysis / the correctness)
3. The efficiency of the method (the size of the footprint)

37

SLIDE 38

The Experiments to Validate the Concept States

§ I completed the Institutional Review Board for Protection of Human Subjects (IRB) exams to obtain permission to conduct a human-subject test.
§ I prepared a test that was administered online in one session.
§ The online test contained 47 questions.
§ The participants were 154 graduate-level learners attending the CS 61002 Algorithms and Programming class in the Computer Science department.
§ Two types of questions (DQ and OQ) were administered to the learners.
§ The OQ are typical of the kind of questions usually given to learners in their midterm exam.
§ The first 9 questions were selected from the OQ.
§ The remaining 38 questions were direct questions asking about the exact skills extracted from the analysis of the OQ.
§ Therefore, each concept is evaluated using at least two questions at the same skill level of the concept.
§ The responses to the questions form the dataset R.
§ The dataset R includes the set of response probabilities for the questions asked about each concept at a certain skill level.

38

SLIDE 39

The Experiments to Validate the Concept States

Value of Answer to OQ | Value of Answer to DQ | Matching Result
1 | 1 | Correct (+)
0 | 0 | Correct (−)
1 | 0 | False (+)
0 | 1 | False (−)

Ø The experiment analysis:
Ø The value of 1 is given to each correct answer for each tested skill, by OQ and DQ.
Ø The value of 0 is given to each incorrect answer for each tested skill, by OQ and DQ.
Ø Thus, if the student's answers to the identical tested skill have the same value, either 0 or 1, in the DQ and the OQ, then the matching value is 1; otherwise it is 0.

39
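The matching analysis above reduces each (OQ, DQ) pair to one of four outcomes. A minimal sketch, with illustrative names and sample data that are not from the experiment:

```python
# Classify each (OQ answer, DQ answer) pair per the matching table:
# 1-1 and 0-0 count as matches, 1-0 and 0-1 as mismatches.
def match_outcome(oq, dq):
    return {(1, 1): "Correct (+)", (0, 0): "Correct (-)",
            (1, 0): "False (+)", (0, 1): "False (-)"}[(oq, dq)]

pairs = [(1, 1), (1, 1), (0, 0), (1, 0)]  # hypothetical responses
match_rate = sum(o == d for o, d in pairs) / len(pairs)
print([match_outcome(o, d) for o, d in pairs])
print(match_rate)  # 0.75
```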

SLIDE 40

Counting Skills Result According to the Related Questions in the Figure:

Type of Questions | Bloom Link | Skill Level | Concepts Counting
OQ & DQ | V | 2 | 1
OQ & DQ | V | 3 | 1
OQ & DQ | V | 4 | 1
OQ & DQ | V | 6 | 1
(plus five DQ rows, still empty at this step)

The Related Questions that Have to be Matched:

Relation Type | C# | Level# | Related OQ | Related DQ
V | 1 | 6 | 1 |
V | 2 | 2 | 2 |
V | 3 | 3 | 3 |
V | 4 | 4 | 4 |
(plus seven D rows and two P rows, still empty at this step)

[Figure: concepts C1–C8 linked to open questions OQ1–OQ4 at levels L2–L6.]

40

SLIDE 41

Counting Skills Result According to the Related Questions in the Figure:

Type of Questions | Bloom Link | Skill Level | Concepts Counting
OQ & DQ | V | 2 | 1
OQ & DQ | V | 3 | 1
OQ & DQ | V | 4 | 1
OQ & DQ | V | 6 | 1
(plus five DQ rows, still empty at this step)

The Related Questions that Have to be Matched:

Relation Type | C# | Level# | Related OQ | Related DQ
V | 1 | 6 | 1 | 5
V | 2 | 2 | 2 | 6
V | 3 | 3 | 3 | 7
V | 4 | 4 | 4 | 8
(plus seven D rows and two P rows, still empty at this step)

[Figure: as on Slide 40, with direct questions DQ5–DQ8 added alongside OQ1–OQ4.]

41

SLIDE 42

Counting Skills Result According to the Related Questions in the Figure:

Type of Questions | Bloom Link | Skill Level | Concepts Counting
OQ & DQ | V | 2 | 1
OQ & DQ | V | 3 | 1
OQ & DQ | V | 4 | 1
OQ & DQ | V | 6 | 1
DQ | D | 2 | 3
DQ | D | 3 | 1
DQ | D | 4 | 1
DQ | D | 5 | 1

The Related Questions that Have to be Matched:

Relation Type | C# | Level# | Related OQ | Related DQ
V | 1 | 6 | 1 | 5
V | 2 | 2 | 2 | 6
V | 3 | 3 | 3 | 7
V | 4 | 4 | 4 | 8
D | 5 | 2 | 2 | 9
D | 6 | 2 | 3 | 13
D | 7 | 2 | 3 | 14
D | 6 | 5 | 3 | 10
D | 7 | 4 | 3 | 11
D | 6 | 5 |   | 9 & 10
D | 7 | 4 |   | 11 & 13
P | 8 | 5 |   |
P | 8 | 5 |   |

[Figure: as on Slide 41, with direct questions DQ9–DQ11, DQ13 and DQ14 added.]

42

SLIDE 43

Counting Skills Result According to the Related Questions in the Figure:

Type of Questions | Bloom Link | Skill Level | Concepts Counting
OQ & DQ | V | 2 | 1
OQ & DQ | V | 3 | 1
OQ & DQ | V | 4 | 1
OQ & DQ | V | 6 | 1
DQ | D | 2 | 3
DQ | D | 3 | 1
DQ | D | 4 | 1
DQ | D | 5 | 1
DQ | P | 5 | 1

The Related Questions that Have to be Matched:

Relation Type | C# | Level# | Related OQ | Related DQ
V | 1 | 6 | 1 | 5
V | 2 | 2 | 2 | 6
V | 3 | 3 | 3 | 7
V | 4 | 4 | 4 | 8
D | 5 | 2 | 2 | 9
D | 6 | 2 | 3 | 13
D | 7 | 2 | 3 | 14
D | 6 | 5 | 3 | 10
D | 7 | 4 | 3 | 11
D | 6 | 5 |   | 9 & 10
D | 7 | 4 |   | 11 & 13
P | 8 | 5 |   | 9 & 12
P | 8 | 5 |   | 11 & 12

[Figure: as on Slide 42, with direct question DQ12 added.]

43

SLIDE 44

The Validation of VS: The Matching

The percentage of the matching of skills in the VS set (X: matching result, 1 for matching or 0 for not matching; Y: percentage of the matching; 1-1 and 0-0 count as matching, 1-0 and 0-1 as not matching):

Outcome        | Level 2 | Level 3 | Level 4 | Level 5 | Level 6
Correct(+) 1-1 | 84%     | 83%     | 90%     | 58%     | 83%
Correct(−) 0-0 | 12%     | 13%     | 5%      | 31%     | 13%
False(+) 1-0   | 2%      | 2%      | 4%      | 10%     | 2%
False(−) 0-1   | 2%      | 2%      | 1%      | 0%      | 3%

44

SLIDE 45

The Validation of DS: The Matching

The percentage of the matching of skills in the DS set (X: matching values of the skill levels; Y: percentage of the matching; 1-1 and 0-0 count as matching, 1-0 and 0-1 as not matching):

Outcome        | Level 2 | Level 3 | Level 4 | Level 5 | Level 6
Correct(+) 1-1 | 81%     | 84%     | 71%     | 67%     | 95%
Correct(−) 0-0 | 15%     | 12%     | 23%     | 31%     | 3%
False(+) 1-0   | 2%      | 2%      | 5%      | 0%      | 2%
False(−) 0-1   | 2%      | 1%      | 1%      | 1%      | 0%

45

SLIDE 46

The Validation of PS: The Matching

The percentage of the matching of skills in the PS set (X: matching result, 1 for matching or 0 for not matching; Y: percentage of the matching):

Outcome        | Level 2 | Level 3 | Level 4 | Level 5 | Level 6
Correct(+) 1-1 | 89%     | 86%     | 92%     | 81%     | 79%
False(+) 1-0   | 0%      | 3%      | 2%      | 3%      | 5%

46

SLIDE 47

The Efficiency of the Theory

Size of Footprint (Number of Fundamental Tested Concepts), based on perfect student responses. (The perfect student is a student who answers all the questions correctly.)

Skill Level | Verified VS | Derived DS | Potential PS
L2          | 7           | 12         | 13
L3          | 4           | 11         | 11
L4          | 2           | 3          | 2
L5          | 1           | 2          | 2
L6          | 4           | 3          | 3
Sum         | 18          | 31         | 31

  • The actual number of directly tested concepts is 18.
  • The number of concepts estimated by the DS method is 31.
  • The number of concepts estimated by the PS method is 31, meaning the learner is ready to know an additional 31 concepts.
  • We can therefore estimate the knowing of 80 concepts at their skill levels, even though only 18 particular concepts were directly tested.

47
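The footprint arithmetic on this slide can be checked directly (the numbers below are the slide's counts):

```python
# Footprint computation: 18 directly tested concepts let the method
# estimate 18 + 31 + 31 = 80 concept-skill states.
vs = {"L2": 7, "L3": 4, "L4": 2, "L5": 1, "L6": 4}   # verified (directly tested)
ds = {"L2": 12, "L3": 11, "L4": 3, "L5": 2, "L6": 3}  # derived
ps = {"L2": 13, "L3": 11, "L4": 2, "L5": 2, "L6": 3}  # potential

tested = sum(vs.values())
estimated = tested + sum(ds.values()) + sum(ps.values())
print(tested, estimated)  # 18 80
```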

SLIDE 48

The Efficiency of the Theory

Size of Footprint (Number of Fundamental Tested Concepts)

[Bar chart. X: the estimated skill level (L2–L6); Y: the count of estimated concepts at the corresponding skill level. Verified: 7, 4, 2, 1, 4; Derived: 12, 11, 3, 2, 3; Potential: 13, 11, 2, 2, 3 (values as in the Slide 47 table).]

48

SLIDE 49

The Outline

Ø Introduction
Ø Problem Definition
Ø The Proposed Model and the Theory TCS2
Ø The Experiment to Validate the Concept States of the Theory of Cognitive Skill in Concept Space TCS2
Ø The Probabilistic Methods for Estimating the Learning States of the Concepts in Each Zone
Ø Contribution
Ø Conclusion and Future Works

49

SLIDE 50

The Probabilistic Methods for Estimating the Learning States of Concepts in Each Zone

1. The reason for using Bayes' Theorem
2. An illustrative example of the Bayes' Theorem computation
3. The probability computation of the Concept State from the human-subject test

50

slide-51
SLIDE 51

The Probability Computation to Estimate the Concept States of the Students

1. The reason for using Bayes' Theorem

Why use Bayes' Theorem?
§ Concepts exist in a complex domain: a concept can have several prerequisite relations and can be inferred from more than one concept.
§ Evaluations of a concept can conflict, e.g. a student gives contradictory answers to questions asked about the same concept.
§ Real exams exhibit other phenomena such as lucky guesses and careless mistakes.
§ People have varying levels of initial knowledge.

slide-52
SLIDE 52

1. The reason for using Bayes' Theorem
2. Illustration example of Bayes' Theorem computation
3. The probability computation of the concept state by human subject test

The Probability Computation to Estimate an Accurate Evaluation of the Concept States

slide-53
SLIDE 53
  • 2. Illustration Example of Bayes' Theorem Computation

Verified Skills Zones

[Diagram: questions q1–q4 directly test concepts C1 (L6), C2 (L2), C3 (L3), C4 (L4).]

§ One evidence infers directly one concept at a certain skill level, for example C1^L6, C2^L2, C3^L3, C4^L4.

P(Cj^Lk | Qr) = P(Qr | Cj^Lk) = 1 − g
P(Cj^Lk | Q̄r) = P(Q̄r | Cj^Lk) = m

  • Given:
  • P(Qr | C̄j^Lk) = g when there is a dependency between Qr and Cj; "r" indicates the index of the question (the probability of a lucky guess).
  • P(Q̄r | Cj^Lk) = m (the probability of a careless slip).
  • Since P(Qr | C̄j^Lk) = g and g = e, then P(Qr | Cj^Lk) = 1 − e = 1 − g.
  • In the special case where P(Q̄r | Cj^Lk) = m, with m the error value related to the asked question qr, then P(Q̄r | C̄j^Lk) = 1 − P(Qr | C̄j^Lk) = 1 − e = 1 − g.

Legend: qr is a question node; Cj is a concept node. Answering the question qr correctly depends on knowing the concept Cj at skill level Lk.
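The single-evidence update described on this slide can be sketched in Python. This is a minimal sketch assuming the slide's error model (g = probability of a lucky guess, m = probability of a careless slip); the function name and signature are illustrative, not from the dissertation's code.

```python
def single_evidence_posterior(correct: bool, d: float, g: float, m: float) -> float:
    """P(Cj^Lk | response) for one question that directly tests the concept.

    d: prior probability of knowing the concept
    g: P(correct | not knowing)  -- lucky guess
    m: P(incorrect | knowing)    -- careless slip
    """
    if correct:
        like_known, like_unknown = 1.0 - m, g
    else:
        like_known, like_unknown = m, 1.0 - g
    numerator = like_known * d
    return numerator / (numerator + like_unknown * (1.0 - d))
```

With d = 0.5 and g = m = 0.2, a correct answer yields a posterior of 0.8 and an incorrect answer yields 0.2.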

slide-54
SLIDE 54

2. Illustration Example of Bayes' Theorem Computation: Derived Skill Zones

[Diagram: questions q1–q4 directly test C1 (L6), C2 (L2), C3 (L3), C4 (L4); prerequisite edges lead to the derived concepts C5, C6, C7 at levels L2–L5.]

§ One evidence infers directly one concept at a certain skill level, for example C1^L6, C2^L2, C3^L3, C4^L4.
§ Many evidences infer one concept at the same skill level, for example C5^L2, C6^L2, C7^L4.

Legend: qr is a question node; Cj is a concept node (answering qr correctly depends on knowing Cj at skill level Lk). An edge from a source concept Cs to a target concept Ct labeled Lk means that knowing Ct at skill level Lk depends on knowing the source concept Cs.

slide-55
SLIDE 55

2. Illustration Example of Bayes' Theorem Computation: Potential Skills Zones

[Diagram: questions q1–q4 directly test C1 (L6), C2 (L2), C3 (L3), C4 (L4); prerequisite edges lead to the concepts C5–C8 at levels L2–L5.]

§ One evidence infers directly one concept at a certain skill level, for example C1^L6, C2^L2, C3^L3, C4^L4.
§ One evidence infers indirectly one concept, for example C6^L5, C7^L2.
§ Many evidences infer one concept at the same skill level, for example C5^L2, C6^L2, C7^L4, and C8^L5.

Legend: qr is a question node; Cj is a concept node (answering qr correctly depends on knowing Cj at skill level Lk). An edge from a source concept Cs to a target concept Ct labeled Lk means that knowing Ct at skill level Lk depends on knowing the source concept Cs.

slide-56
SLIDE 56

Example of an Evaluated Concept by Many Evidences: P(C8^L5 | R3,i)

This example illustrates the most complicated evaluation of knowing a concept: the behavior of P(C8^L5 | R3,i) for i = 1, 2, 3, 4, 5, where

R3,1 = {Q̄1}
R3,2 = {Q̄1, Q2}
R3,3 = {Q̄1, Q2, Q3}
R3,4 = {Q̄1, Q2, Q̄3}
R3,5 = {Q̄1, Q̄2, Q̄3, Q̄4}

  • Qr: correct response
  • Q̄r: incorrect response

Given:
  • P(Qr | C̄j^Lk) = g when there is a dependency between Qr and Cj (lucky guess)
  • P(Q̄r | Cj^Lk) = m (careless slip)
  • P(Cj^Lk) = d
  • P(C̄j^Lk) = 1 − P(Cj^Lk) = 1 − d
  • Since P(Qr | C̄j^Lk) = g and g = e, then P(Qr | Cj^Lk) = 1 − e = 1 − g
  • In the special case where P(Q̄r | Cj^Lk) = m, with m the error value related to the asked question qr, then P(Q̄r | C̄j^Lk) = 1 − P(Qr | C̄j^Lk) = 1 − g

[Diagram: q1–q4 test C1–C4; C8 at L5 is inferred through prerequisite edges from several evidences.]

slide-57
SLIDE 57

The Used Equation is the Extended Formula of Bayes' Theorem

P(A|B) = P(B|A) · P(A) / [P(B|A) · P(A) + P(B|Ā) · P(Ā)]

  • P(A|B), the posterior probability, is the probability of A after considering the evidence B for and against A.
  • P(A) is the unconditional probability of A, which is the initial (prior) probability of knowing the concept.
  • Ri is the set of the responses to the questions asked about the concept C.
  • P(B|A), the conditional probability or likelihood, is the degree of belief in B given that the proposition A is true.
  • P(B|Ā), the conditional probability or likelihood, is the degree of belief in B given that the proposition A is false.
  • P(Ā) is the corresponding probability of the initial degree of belief against A: 1 − P(A) = P(Ā).

This formula is applied as P(Cj^Lk | Ri) for many evidences of one concept at the same skill level.

slide-58
SLIDE 58

The Used Equation is the Extended Bayes' Theorem

P(Cj^Lk | Ri) = P(Ri | Cj^Lk) · P(Cj^Lk) / [P(Ri | Cj^Lk) · P(Cj^Lk) + P(Ri | C̄j^Lk) · P(C̄j^Lk)]

  • P(Cj^Lk | Ri) is the probability of knowing the concept Cj at skill level k after considering the response set Ri.
  • Cj^Lk denotes knowing the concept Cj at skill level k.
  • P(Cj^Lk) is the unconditional probability of knowing Cj^Lk, which is the initial probability of knowing the concept; it is just the rate of correct responses to the questions asked about the concept Cj^Lk.
  • Ri is the set of the responses to the questions asked about the concept Cj.
  • P(Ri | Cj^Lk) is the probability of the responses (evidences) on the condition of knowing the concept Cj^Lk.
  • P(Ri | C̄j^Lk) is the probability of the responses on the condition of not knowing the concept Cj.
  • P(C̄j^Lk) is the unconditional probability of not knowing the concept Cj^Lk, which is the initial probability of not knowing the concept; it is just the rate of incorrect responses to the questions asked about the concept.

This gives P(Cj^Lk | Ri) for many evidences of one concept at the same skill level.
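Under the assumptions used in this part of the deck (the responses in Ri treated as conditionally independent given the concept state, with guess rate g, slip rate m, and prior d equal to the rate of correct responses), the extended Bayes update can be sketched as follows; the function and parameter names are illustrative, not taken from the dissertation's code.

```python
def concept_posterior(responses, d, g, m):
    """P(Cj^Lk | Ri) for a set of responses about one concept.

    responses: booleans, True = correct answer
    d: prior P(Cj^Lk), the rate of correct responses
    g: lucky-guess probability; m: careless-slip probability
    """
    p_r_known = 1.0    # P(Ri | knowing the concept)
    p_r_unknown = 1.0  # P(Ri | not knowing the concept)
    for correct in responses:
        p_r_known *= (1.0 - m) if correct else m
        p_r_unknown *= g if correct else (1.0 - g)
    numerator = p_r_known * d
    return numerator / (numerator + p_r_unknown * (1.0 - d))
```

For example, two correct responses with d = 0.5 and g = m = 0.2 give a posterior of about 0.94, matching the worked example on the next slides.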

slide-59
SLIDE 59

  • R3,3 = {Q̄1, Q2, Q3}
  • P(C8^L5 | R3,3)

= [P(R3,3 | C2^L2) · P(R3,3 | C3^L3) · P(C8^L5)] / [P(R3,3 | C2^L2) · P(R3,3 | C3^L3) · P(C8^L5) + P(R3,3 | C̄2^L2) · P(R3,3 | C̄3^L3) · P(C̄8^L5)]

= [P(Q2 | C2^L2) · P(Q3 | C3^L3) · P(C8^L5)] / [P(Q2 | C2^L2) · P(Q3 | C3^L3) · P(C8^L5) + P(Q2 | C̄2^L2) · P(Q3 | C̄3^L3) · P(C̄8^L5)]

= [(1 − g) · (1 − g) · P(C8^L5)] / [(1 − g) · (1 − g) · P(C8^L5) + g · g · P(C̄8^L5)]

= [(1 − 2g + g²) · d] / [(1 − 2g + g²) · d + g² · (1 − d)]

= (d − 2gd + dg²) / (d − 2gd + dg² + g² − g²d)

Legend: a crossed-out qr means the question is answered incorrectly; a plain qr means it is answered correctly.

[Diagram: q1 answered incorrectly; q2 and q3 answered correctly; C8 at L5 inferred from C2^L2 and C3^L3.]
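The derivation above can be checked numerically. The values below are examples only: the slides later use g = m = 0.2, and d here is set to 0.5.

```python
# Two correct supporting responses (Q2, Q3) for C8^L5, as in R3,3.
g, d = 0.2, 0.5

# Factored form from the slide's derivation.
factored = (1 - g) * (1 - g) * d / ((1 - g) * (1 - g) * d + g * g * (1 - d))
# Expanded form from the last line of the derivation.
expanded = (d - 2*g*d + d*g**2) / (d - 2*g*d + d*g**2 + (g**2 - g**2 * d))

assert abs(factored - expanded) < 1e-12  # the two forms agree
print(round(factored, 4))  # → 0.9412
```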

slide-60
SLIDE 60

[Bar chart: P(C8^L5 | R3,i) for the five response sets, each evaluated under different priors d. Visible values across the series include 0.50, 0.80, 0.84, 0.94, 0.98, and 1, and 0, 0.02, 0.06, 0.16.]

§ Y: probability of knowing the concept C8^L5
§ X: d1 = 0.0, d2 = 0.25, d3 = 0.50, d4 = 0.75, d5 = 1; d6 = the proper value of d, i.e. the ratio of the correct answers
§ Response sets: P(C8^L5 | R3,1), R3,1 = {Q̄1}; P(C8^L5 | R3,2), R3,2 = {Q̄1, Q2}; P(C8^L5 | R3,3), R3,3 = {Q̄1, Q2, Q3}; P(C8^L5 | R3,4), R3,4 = {Q̄1, Q2, Q̄3}; P(C8^L5 | R3,5), R3,5 = {Q̄1, Q̄2, Q̄3, Q̄4}
§ * marks a response set in which he got a correct answer

The probability of errors e: g = m = 0.2
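Sweeping the prior d for the two-correct-evidence case of the worked example illustrates the behavior this slide plots: extreme priors d = 0 and d = 1 pin the posterior regardless of the evidence, while intermediate priors are pulled upward by correct responses. A sketch under the same g = m = 0.2 assumption; the function name is illustrative.

```python
def posterior_two_correct(d, g=0.2):
    """Posterior after two correct supporting responses (the R3,3 example)."""
    numerator = (1 - g) ** 2 * d
    return numerator / (numerator + g ** 2 * (1 - d))

for d in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(d, round(posterior_two_correct(d), 3))
```

The intermediate results (about 0.842, 0.941, 0.980) appear consistent with one of the series visible in the chart residue above.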

slide-61
SLIDE 61

1. The reason for using Bayes' Theorem
2. Illustration example of Bayes' Theorem computation
3. The probability computation of the concept state by human subject test

The Probability Computation to Estimate an Accurate Evaluation of the Concept States

slide-62
SLIDE 62
  • 3. The Probability Computation of the Human Subject Test

Ø The probability of knowing the concepts in VS according to:
  • 1. The evaluation of a selected learner (the most complicated answers)
  • 2. The accurate probability of knowing & not knowing the learning object domain of 30 students, based on the evaluation result of the entire concepts in VS

Ø The probability of knowing the concepts in DS according to:
  • 1. The evaluation of a selected learner (the most complicated answers)
  • 2. The accurate probability of knowing & not knowing the learning object domain of 30 students, based on the evaluation result of the entire concepts in DS

Ø The probability of knowing the concepts in PS according to:
  • 1. The evaluation of a selected learner (the most complicated answers)
  • 2. The accurate probability of knowing & not knowing the learning object domain of 30 students, based on the evaluation result of the entire concepts in PS

slide-63
SLIDE 63

The Assessment Structure of the Human Subject Test


slide-64
SLIDE 64

Comparison Between the Estimation of Knowing the Concept Based on the VS Method, the Real Direct Response, and the Computation (18 Concepts)

§ Y: probability of knowing the concept; X: concept ID

§ P(C|R) by the VS method: the probability of knowing the concept estimated from the indirect response; e = 0.1 is the default value, e = 0.13 if the question is unclear or multiple choice
§ P(C|R) by direct response: the probability of knowing the concept estimated from the direct response
§ P(C|R) by the computation of the two responses: the probability of knowing the concept mathematically calculated from both observations (the two responses)

[Chart: Verified Skills, Student #23 — per-concept probabilities, with values in {0.1, 0.5, 0.9, 1}.]

slide-65
SLIDE 65

Comparison Between the Estimation of Knowing the Concept Based on the DS Method, the Real Direct Response, and the Computation (31 Concepts)

§ Y: probability of knowing the concept; X: concept ID

§ P(C|R) by the DS method: the probability of knowing the concept estimated from the indirect response; e = 0.2 is the default value, e = 0.23 if the question is unclear or multiple choice
§ P(C|R) by direct response: the probability of knowing the concept estimated from the direct response; e = 0.1 is the default value, e = 0.03 if the question is unclear or multiple choice
§ P(C|R) by the computation of the two responses: the probability of knowing the concept mathematically calculated from both observations (the two responses)

[Chart: Student #23 — probabilities for the 31 concepts C3-L2, C47-L2, C70-L2, C71-L2, C66-L3, C66-L6, C67-L6, C5-L3, C5-L6, C4-L2, C7-L3, C26-L3, C13-L3, C7-L2, C26-L4, C49-L5, C49-L3, C23-L2, C1-L2, C40-L2, C40-L3, C46-L3, C17-L3, C75-L4, C63-L3, C62-L2, C25-L3, C64-L4, C64-L2, C53-L5, C53-L2.]

slide-66
SLIDE 66

Comparison Between the Estimation of Knowing the Concept Based on the PS Method, the Real Direct Response, and the Computation (31 Concepts)

§ Y: probability of knowing the concept; X: concept ID

§ P(C|R) by the PS method: the probability of knowing the concept estimated from the indirect response; e = 0.2 is the default value, e = 0.23 if the question is unclear or multiple choice
§ P(C|R) by direct response: the probability of knowing the concept estimated from the direct response; e = 0.1 is the default value, e = 0.03 if the question is unclear or multiple choice
§ P(C|R) by the computation of the two responses: the probability of knowing the concept mathematically calculated from both observations (the two responses)

[Chart: Student #23 — per-concept probabilities.]

slide-67
SLIDE 67

The Accurate Probability of Knowing & Not Knowing the Learning Object Domain of 30 Students, Based on the Evaluation Result of the Entire Concepts in DS (example of 30 learners out of 154)

X: learner number. Y: probability of knowing & not knowing the concepts.

P(CLO | R) = P(R | CLO) · P(CLO) / [P(R | CLO) · P(CLO) + P(R | C̄LO) · P(C̄LO)]

P(R | CLO) = the sum of the probabilities of knowing the concepts in the learning object domain for a single learner = Σi P(Ci)

slide-68
SLIDE 68

The Accurate Probability of Knowing & Not Knowing the Learning Object Domain of 30 Students, Based on the Evaluation Result of the Entire Concepts in VS (example of 30 learners out of 154)

X: learner number. Y: probability of knowing & not knowing the concepts.

P(CLO | R) = P(R | CLO) · P(CLO) / [P(R | CLO) · P(CLO) + P(R | C̄LO) · P(C̄LO)]

P(R | CLO) = the sum of the probabilities of knowing the concepts in the learning object domain for a single learner = Σi P(Ci)

[Chart: St.1–St.30 — probability of knowing (mostly 0.75–1, with one learner at 0.30) and its complement for each learner.]

slide-69
SLIDE 69

The Accurate Probability of Knowing & Not Knowing the Learning Object Domain of 30 Students, Based on the Evaluation Result of the Entire Concepts in PS (example of 30 learners out of 154)

X: learner number. Y: probability of knowing & not knowing the concepts.

P(CLO | R) = P(R | CLO) · P(CLO) / [P(R | CLO) · P(CLO) + P(R | C̄LO) · P(C̄LO)]

P(R | CLO) = the sum of the probabilities of knowing the concepts in the learning object domain for a single learner = Σi P(Ci)

[Chart: St.1–St.30 — PKS (probability of knowing, 0.92–1) and PNS (probability of not knowing, 0–0.08) for each learner.]

slide-70
SLIDE 70

Contribution

Ø We describe a method for defining and estimations of the learning zones (which concept in which zones) Ø We have given a methods for estimation of the knowledge states of each of the concept in these zones (how much one knows) Ø We show the estimation of the knowledge states provided by TCS2 can achieve high accuracy Ø The efficiency of the theory §Fast inference algorithm with minimum number of testing concepts §The amount of estimated concepts increases, even though the amount of tested concepts may be minimized and eliminated under the conditions laid down by the target skill levels §The existing of (CLMCG) minimize the time of the estimation of knowing and not knowing the concepts.


slide-71
SLIDE 71

Publication List

Ø R. ¡Aboalela, ¡J. ¡Khan, ¡"Visualizing ¡Concept ¡Space ¡of ¡course ¡content,” ¡IEEE ¡7th ¡ International ¡Conference ¡on ¡Engineering ¡Education ¡(ICEED). ¡pp.160-­‑165. ¡Japan, ¡ November ¡2015. ¡DOI: 10.1109/ICEED.2015.7451512

  • R. ¡Aboalela, ¡J. ¡Khan, ¡"Are ¡we ¡asking ¡the ¡right ¡questions ¡to ¡grade ¡our ¡students ¡in ¡a ¡

knowledge-­‑state ¡space ¡analysis?” ¡2016 ¡IEEE ¡8th ¡International ¡Conference ¡on ¡ Technology ¡for ¡Education ¡(T4E ¡2016). ¡pp. ¡144 ¡-­‑ 147, ¡Mumbai, ¡December ¡2016.

  • DOI: ¡10.1109/T4E.2016.037

Ø R. ¡Aboalela, ¡J. ¡Khan, ¡" ¡Model ¡of ¡Learning ¡Assessment ¡to ¡Measure ¡Student ¡ Learning: ¡Inferring ¡of ¡Concept ¡State ¡of ¡Cognitive ¡Skill ¡Level ¡in ¡Concept ¡Space,” ¡ 2016 ¡Third ¡International ¡Conference ¡on ¡Soft ¡Computing ¡and ¡Machine ¡Intelligence ¡ (ISCMI). ¡Dubai, ¡November ¡2016.


slide-72
SLIDE 72

Thanks

  • I would like to thank everybody who came to my dissertation defense today.
  • For more information, visit my website:

http://rania.medianet.cs.kent.edu:8080/Project/#
