SE206 - Unit 4: Applying and Evaluating Agile in E-Government

Glossary of Terms

Agile Characteristics: Modularity, Iterative (short cycles), Time-bound (1-6 week cycles), Adaptive (to risks), People-oriented, Collaborative, and Communicative are the key traits of Agile development methodologies (Slide 12).

Agile SPM (Agile Software Product Management): A component integrated into the proposed framework. It enables flexible requirement definition, handles large projects at a high management level, and facilitates links between project modules. Includes a Requirement Refinery process (Vision, Theme, Concept, Definition) (Slides 33-35, 41, 44, 48).

Digital Divide: Large differences in the level of access to the internet and related technologies, affecting the ability to benefit from E-Government. Considered a major challenge (Slide 7).

E-Government: The use of Information and Communication Technology (ICT), particularly the internet, as a tool to achieve better government services and operations (Slide 5).

E-Government Architecture: A proposed layered structure including an Access Layer (PCs, mobile), E-Government Layer (web portal), E-Business Layer (ERP, DMS, data), and Infrastructure Layer (servers, networks) (Slide 9).

Effectivity Score: A metric proposed to measure the efficiency of sprints based on planned versus actual time. Formula: (1 - (Unfinished Hours / Estimated Hours)) * 100. Used to evaluate framework performance (Slides 65, 67, 69, 86).

Extreme Programming (XP): An Agile methodology suited to new or incomplete projects with changing requirements, focusing on the development process itself. Designed for small, collocated teams (2-12 members). Key practices include short cycles, refactoring, pair programming, and test-first development (Slides 19, 22, 24, 25, 42, 46, 50).

4-DAT (4-Dimensional Analytical Tool): A tool integrated into the proposed framework used to select the appropriate Agile methodology (XP or Scrum) for specific tasks/modules based on project characteristics (Scope, Agility, Agile Values, and Software Process dimensions) (Slides 21, 30, 37-39, 49, 50, 60-64, 84).

ITPOSMO: A checklist/model (Heeks, 2006) for understanding the components of an e-government system: Information, Technology, Processes, Objectives and values, Staffing and skills, Management systems and structures, and Other resources (Slide 6).

Likert Scale: A scale used to interpret the weighted-average scores from satisfaction questionnaires (e.g., a 1-5 scale whose results map to levels such as Poor, Accepted, Agreeable, Satisfactory, and Enriching) (Slides 82, 83, 87, 88).

Product Backlog: A prioritized list of features, functions, requirements, enhancements, and fixes that constitute the changes to be made to the product in future releases. Used in Scrum and in the proposed framework (Slides 27, 30, 34, 45, 47, 59).

Refactoring: A core XP practice of restructuring existing code (changing its factoring) without changing its external behavior, used to improve code simplicity, understandability, and maintainability (Slides 24, 46).

Scrum: An Agile framework focused on project management, suited to environments with changing variables (requirements, technology). Works well for small teams (<10) and for distributed teams. Key elements include Sprints, Daily Meetings, the Product Backlog, and the Scrum Master (Slides 19, 26-28, 45, 47, 50).

Sprint: A fixed-length iteration in Scrum (typically 1-4 weeks; shown as 30 days or 2-6 weeks in the diagrams), during which a potentially shippable product increment is created (Slides 27, 44, 45, 47, 48, 67-72).

Key Concepts from Unit 4

E-Government Fundamentals

  • Definition: Using ICT (esp. internet) for better government (Slide 5).
  • Environment: Influenced by Technology, People, Process, Social Culture (Slide 4).
  • Model (Heeks): ITPOSMO checklist helps analyze components (Slide 6).
  • Challenges: Primarily Digital Divide, also financial and legal barriers (Slide 7).
  • Architecture: Layered approach - Access, E-Gov, E-Business, Infrastructure (Slide 9). E-Gov layer affected by culture, feedback, policy, HCI (Slide 10).

Agile Principles & Comparison

  • Characteristics: Iterative, Time-bound, Adaptive, People-oriented, Collaborative, Modular (Slide 12).
  • Advantages: Incremental delivery, early feedback/issue detection, response to change, customer satisfaction (Slides 13, 14).
  • Agile vs. Traditional:
    • Traditional: Fixed Functionality, Flexible Time/Resources. Sequential approach (Slides 15, 17).
    • Agile: Fixed Time/Resources, Flexible Functionality. Iterative approach with feedback loops (Slides 15, 16).
  • Applicability (Radar Chart): Agile often suits higher Dynamism, smaller Teams/Size (initially), cultures thriving on change, and lower Criticality projects compared to traditional methods (Slide 18).

XP and Scrum Overview

  • XP (Extreme Programming): Suited for changing requirements, small collocated teams. Focuses on development practices (refactoring, pairing, TDD). Strengths: Simple solutions, code quality. Limitations: Scalability, collocation requirement (Slides 22, 24, 25).
  • Scrum: Suited for changing environments, small or distributed teams. Focuses on project management (sprints, roles, meetings). Strengths: Handles large teams via division, adapts to change. Limitations: Less focus on specific engineering practices (Slides 26, 28).
  • Comparison: XP/Scrum combination scores highly on productivity, quality, satisfaction. Both show high degrees of agility (Slides 20, 21).

Proposed E-Government Agile Framework

  • Objective: Utilize merits of Agile (XP, Scrum) enhanced by SPM and 4-DAT for E-Government projects (Slide 29).
  • Components: Vision -> SPM Phase (Requirement Refining/Estimation) -> Product Management Sprint Backlog (PMSB) -> 4-DAT (Method Selection) -> Development Phase (XP or Scrum) -> Integration/Testing -> Update Backlog (Slide 30).
  • Agile SPM Role: Flexible requirement definition (Vision->Theme->Concept->Definition), high-level management for large projects, standardizes interfaces (Slides 33, 34).
  • 4-DAT Role: Analytical tool to choose between XP or Scrum based on task/module characteristics (Scope, Agility, Values, Process) (Slides 37-39).
  • Integration: Framework combines SPM's planning with 4-DAT's selection mechanism to feed into either XP's development-focused cycle or Scrum's management-focused cycle (Slides 41-45).
  • Features: Aims to combine the strengths of SPM, 4-DAT, XP, and Scrum (Slide 50).

Framework Evaluation Approach

  • Methods:
    • Quantitative: Sprint efficiency measured by Effectivity Score.
    • Qualitative: Stakeholder, User, and Teamwork satisfaction measured via questionnaires and analyzed using SPSS and Likert scales (Slides 65, 74).
  • Effectivity Score: Measures time efficiency per sprint. Effectivity Score = (1 - (Unfinished Hours / Estimated Hours)) * 100 (Slide 69). Unfinished Hours represent the deviation (over/under) from the estimate.
  • Average Metrics: Formulas are provided for calculating the average effectivity score (weighted by estimated hours) and the average sprint performance (weighted by number of tasks) across multiple sprints (Slides 71, 72):
    Avg Effectivity = (Σ(E_i * H_i) / Σ(H_j)) * 100
    Avg Performance = (Σ(E_i * S_i) / Σ(S_j)) * 100
    where E = effectivity (as a fraction), H = estimated hours, S = number of tasks, and i/j index the sprints.
  • Satisfaction Evaluation: Uses questionnaires with Likert scale responses (e.g., 1-Poor to 5-Enriching). Weighted Averages (W.A.) calculated via SPSS determine overall satisfaction levels (Slides 75-83).
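The quantitative metrics above can be sketched in code. This is a minimal sketch: the function names are illustrative, and treating the per-sprint score as a fraction inside the weighted average is an interpretation of the slides' formulas, not something the presentation states explicitly.

```python
# Sketch of the sprint metrics (formulas from Slides 69 and 71).
# Function names are illustrative, not from the presentation.

def effectivity(estimated_hours: float, unfinished_hours: float) -> float:
    """Effectivity Score = (1 - Unfinished Hours / Estimated Hours) * 100."""
    return (1 - unfinished_hours / estimated_hours) * 100

def avg_effectivity(sprints: list) -> float:
    """Average effectivity across sprints, weighted by estimated hours.

    Each sprint is an (estimated_hours, unfinished_hours) pair. The per-sprint
    score is used as a fraction here so the final * 100 yields a percentage.
    """
    total_hours = sum(est for est, _ in sprints)
    weighted = sum((effectivity(est, unf) / 100) * est for est, unf in sprints)
    return weighted / total_hours * 100

# Example from the text (Slide 68): 12 unfinished hours against a 200-hour
# estimate gives a score of 94%.
print(effectivity(200, 12))   # 94.0
```

Tracking these scores per sprint is what allows the framework evaluation to report a single figure (e.g., the 86.49% average over 15 sprints in the BSIC case study).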

Essay Questions


The presentation proposes a framework designed to apply Agile methodologies effectively to E-Government projects by integrating the strengths of several components (Objective on Slide 29, Framework Diagram on Slide 30, 32, 36, 40, 43).

Main Components and Flow:

  1. Vision: The starting point, defining the overall goal or direction of the E-Government project.
  2. SPM Phase (Agile Software Product Management): This phase takes the vision and performs:
    • Requirement Refining: Breaks down high-level needs using stages like Vision -> Theme -> Concept -> Requirements Definition (Slide 34).
    • Effort Estimation: Estimates the work required for the refined requirements.
    This phase produces prioritized requirements, likely feeding into the backlog.
  3. Product Management Sprint Backlog (PMSB): This backlog holds the refined and estimated requirements/tasks derived from the SPM phase.
  4. 4-DAT (4-Dimensional Analytical Tool): Acts as a decision point. Based on the characteristics of tasks selected from the PMSB (analyzing Scope, Agility, etc.), 4-DAT determines whether XP or Scrum is the more suitable methodology for developing that specific task or module.
  5. Development Phase: The actual implementation work occurs here, using the methodology selected by 4-DAT:
    • XP: If selected, development follows XP practices (pairing, TDD, refactoring, etc.).
    • Scrum: If selected, development follows Scrum practices (sprints, daily meetings, etc.).
  6. Integration & System Testing: After development sprints/iterations, the completed work is integrated and tested as a whole system.
  7. Update Backlog: Feedback from testing or new insights might lead to updates or additions to the PMSB, creating a continuous feedback loop.

Overall Goal: The framework aims to leverage SPM for better requirement management, use 4-DAT for flexible methodology selection, and apply XP/Scrum for efficient development, ultimately improving the process for complex E-Government projects (Slide 84, 85).
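The flow described above can be sketched as a simple control loop. Everything in this sketch is an illustrative assumption: the function names, the toy backlog contents, and the stand-in 4-DAT rule (which merely mirrors the Slide 60-63 outcome that XP suited the GUI work and Scrum suited the coding/testing work).

```python
# Minimal sketch of the framework's control flow (Slide 30). All names, data
# shapes, and the selection rule are illustrative placeholders.

def refine_and_estimate(vision):
    """SPM phase: Vision -> Theme -> Concept -> Definition, with estimates.
    Reduced here to a toy PMSB of (task, estimated_hours, is_ui_work) tuples."""
    return [("search GUI", 40, True), ("search coding and testing", 80, False)]

def select_method_4dat(task):
    """4-DAT decision point. The real tool scores four dimensions; this
    stand-in just reproduces the Slide 60-63 example outcome."""
    _, _, is_ui_work = task
    return "XP" if is_ui_work else "Scrum"

def run_framework(vision):
    backlog = refine_and_estimate(vision)   # SPM phase produces the PMSB
    plan = []
    while backlog:
        task = backlog.pop(0)               # next prioritized PMSB item
        method = select_method_4dat(task)   # choose XP or Scrum per task
        plan.append((task[0], method))      # development happens here, then
                                            # integration/testing and backlog
                                            # updates close the feedback loop
    return plan

print(run_framework("BSIC portal"))
# [('search GUI', 'XP'), ('search coding and testing', 'Scrum')]
```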

The presentation provides insights into both XP and Scrum, highlighting their characteristics and how they compare (Slides 19-28):

Extreme Programming (XP):

  • Suitability: Best for new or incomplete projects where requirements are expected to change frequently. Designed for small (2-12 members), collocated project teams (Slide 22).
  • Focus: Emphasizes the actual development process and technical practices (Slide 22, 24).
  • Strengths: Delivers simplest solutions in short cycles; frequent refactoring improves design; frequent reviews enhance quality (Slide 24).
  • Limitations: Lacks scalability for large teams/products; requires the team to be physically collocated (Slide 25).
  • Key Practices Examples: Pair programming, Test-first, Refactoring, Continuous Integration, Small Releases (Slide 46).

Scrum:

  • Suitability: Good for projects where environmental and technical variables (requirements, tech, resources) are likely to change. Suitable for small teams (<10) but can be adapted for larger projects and works for distributed teams (Slide 26).
  • Focus: Emphasizes project management techniques and adapting to change (Slide 28).
  • Strengths: Allows large teams to work like small teams by dividing work and synchronizing; effective at managing change and fixing problems continuously (Slide 28).
  • Limitations: Does not focus heavily on specific development/engineering practices (Slide 28).
  • Key Practices Examples: Sprints, Daily Meetings, Product Backlog, Scrum Master, Time Boxing (Slide 47).

Comparison Summary:

  • XP is more prescriptive about how to build software (engineering practices), while Scrum focuses more on how to manage the project (process framework).
  • XP traditionally requires collocation, while Scrum is more adaptable to distributed teams.
  • Both score well on user satisfaction and productivity, and the XP/Scrum combination scores highly as well (Slide 20).
  • Both exhibit high degrees of agility, with Scrum potentially showing slightly higher practice agility in the 4-DAT comparison shown (Slide 21).

The proposed framework leverages this difference by using 4-DAT to select XP for tasks benefiting from its technical rigor and Scrum for tasks needing its management flexibility.

Agile Software Product Management (SPM) is presented as a key component integrated at the beginning of the proposed framework for E-Government projects (Slides 30, 33-35, 48).

Role of Agile SPM:

  • Flexible Requirement Definition: It enables customers or stakeholders to define requirements flexibly, moving away from rigid upfront specifications (Slide 33).
  • High-Level Management: It provides a mechanism to handle large projects by managing requirements and scope at a higher level before diving into development iterations (Slide 33).
  • Facilitating Integration: It helps define roles, regulations, and standards that facilitate the links and interfaces between different project modules, which is crucial for complex E-Government systems (Slide 33).
  • Requirement Refinement: It structures the process of breaking down high-level goals into actionable development tasks.

SPM Requirement Refinery Process (Slide 34):

The framework adopts an SPM technique called Requirement Refinery, which includes four stages to progressively detail requirements over time:

  1. Vision: The highest-level goal or direction for the product/project.
  2. Theme: Broad categories or areas of functionality that support the vision (e.g., User Management, Reporting). Appears further out in the planning horizon (e.g., 6+ months).
  3. Concept: More specific features or epics within a theme (e.g., Search Module, User Registration). Planned in a medium-term horizon (e.g., 3-6 months).
  4. Requirements Definition: Detailed, specific requirements or user stories ready for implementation, often planned for the near term (e.g., next 1 month or sprint). This stage translates concepts into specific, described requirements, potentially leading to data models and technical specifications (Slide 35).

Essentially, Agile SPM acts as the planning and requirement shaping engine in the framework, ensuring that the work fed into the development phase (XP/Scrum) via the Product Management Sprint Backlog (PMSB) is well-understood, prioritized, and manageable.
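The four refinery stages can be represented as simple structured data. This is a sketch only: the class name is invented, and the horizon values come from the approximate examples in the text rather than exact slide content.

```python
# Sketch of the Requirement Refinery stages (Slide 34) as data. The horizon
# strings follow the text's rough examples and are assumptions.

from dataclasses import dataclass

@dataclass
class RefineryStage:
    name: str
    horizon: str   # rough planning horizon
    example: str

REFINERY = [
    RefineryStage("Vision", "ongoing", "overall product direction"),
    RefineryStage("Theme", "6+ months", "User Management, Reporting"),
    RefineryStage("Concept", "3-6 months", "Search Module, User Registration"),
    RefineryStage("Requirements Definition", "next sprint/month",
                  "detailed requirements ready for implementation"),
]

for stage in REFINERY:
    print(f"{stage.name}: planned ~{stage.horizon} out ({stage.example})")
```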

The 4-Dimensional Analytical Tool (4-DAT) is presented as a mechanism within the proposed framework to aid in selecting the most appropriate Agile methodology (specifically between XP and Scrum in this context) for developing particular parts of the system (Slides 30, 37-39, 49).

Dimensions of 4-DAT (Slide 37):

It analyzes project or module characteristics across four dimensions:

  1. Scope Dimension: Checks the methodology's support for various project context factors, including: Project size, Team size, Development style, Code style, Technology environment, Physical environment, and Business culture (Slide 38).
  2. Agility Dimension: A quantitative measure checking the *existence* of agility within the methodology at both the process and practice levels (Slide 39).
  3. Agile Values Dimension: Examines the methodology's support for *characterizing* agile values across different practice levels (Slide 39).
  4. Software Process Dimension: Examines the specific practices within the methodology that support the components characterizing the software process in Agile (Slide 39).

Usage in the Framework:

  • Decision Point: 4-DAT sits between the Product Management Sprint Backlog (PMSB) and the Development Phase (XP/Scrum) (Slide 30).
  • Methodology Selection: For a given task or module taken from the PMSB, 4-DAT is used to evaluate how well XP and Scrum fit the characteristics of that specific task based on the four dimensions.
  • Example Application (Slides 60-63): The presentation shows an example where 4-DAT (focusing on the Scope dimension criteria like Task size, Team structure, Tech needs) is used to score XP and Scrum for building a "Search GUI module" and "Search coding and testing".
    • For the GUI: XP scored higher (0.75 vs 0.25), suggesting it was more appropriate.
    • For Coding/Testing: Scrum scored higher (0.63 vs 0.37), suggesting it was chosen for that part.
  • Flexibility: By using 4-DAT, the framework gains flexibility, allowing different parts of the same project to be developed using the methodology best suited to that part's specific context and needs (Slide 85).
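A plausible scoring scheme behind the scope-dimension numbers is the fraction of criteria each methodology best fits. This sketch is an assumption: the criteria names and the scoring rule are invented to reproduce the GUI result; only the resulting 0.75/0.25 scores come from the slides.

```python
# Sketch of 4-DAT scope-dimension scoring for the Slide 60-63 example.
# The scoring rule (share of criteria won by each method) is an assumption.

def scope_scores(fit: dict) -> dict:
    """fit maps each scope criterion to the better-fitting method ("XP"/"Scrum").
    Returns each method's share of the criteria."""
    n = len(fit)
    return {m: sum(v == m for v in fit.values()) / n for m in ("XP", "Scrum")}

# Hypothetical assignment reproducing the GUI result (XP 0.75 vs Scrum 0.25):
gui_fit = {"task size": "XP", "team structure": "XP",
           "tech needs": "XP", "physical environment": "Scrum"}
print(scope_scores(gui_fit))   # {'XP': 0.75, 'Scrum': 0.25}
```

The coding/testing scores (0.63 vs 0.37) suggest the real tool uses weighted rather than equal criteria, which this toy rule does not capture.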

Slide 7 explicitly discusses challenges faced by E-Government initiatives. The main challenges highlighted are:

  • Digital Divide: This is presented as a primary challenge, defined as "large differences in the level of access to the internet and therefore ability to benefit from e-government". This implies that unequal access to technology prevents equitable use of online government services.
  • Financial Barriers: The cost associated with developing, implementing, and maintaining E-Government systems, as well as the cost for citizens to access the necessary technology (computers, internet connection), can be significant barriers.
  • Legal Barriers (implied by the slide's 'Leg' label): Issues related to legislation, regulations, privacy laws, data-security standards, and digital signatures can pose hurdles for E-Government services that handle sensitive information or transactions.
  • Technological Knowledge/Skills Gaps (implied by the slide's 'Techn Age' label): Both government staff and citizens may lack the technical skills or literacy needed to develop, manage, or use E-Government systems. This points to training needs and usability design.

Additionally, while not explicitly listed on slide 7, the general context of government projects often involves:

  • Integration with Legacy Systems: Governments often have older systems that need to be integrated with new E-Government platforms.
  • Bureaucracy and Resistance to Change: Established processes and organizational structures can make adopting new technologies and workflows difficult.
  • Changing Political Priorities and Policies: Government projects can be affected by shifts in policy or leadership.

The presentation suggests that Agile methods, particularly within the proposed framework, can help address some of these challenges, especially those related to changing requirements and managing complex projects.

The Effectivity Score is introduced as a quantitative metric to measure the efficiency of the implemented framework by evaluating the performance of individual sprints (Slides 65, 67, 69, 86).

Calculation (Formula from Slide 69):

Effectivity score = { 1 - (Unfinished Hours / Estimated Hours) } * 100

Where:

  • Estimated Hours: The planned hours for the tasks within a sprint, determined during the requirements refinery/planning stage (likely from SPM).
  • Unfinished Hours: The planned work (in hours) left incomplete at the end of the sprint, i.e., the deviation between the estimate and what was actually delivered. The slides are not fully explicit about whether this is the overrun (Actual Hours - Estimated Hours) or the estimated hours of tasks not completed; either way, the formula reduces the score in proportion to the shortfall. For example, with Unfinished Hours = 12 and Estimated Hours = 200 (Slide 68): (1 - 12/200) * 100 = 94%.

Interpretation: The score measures a sprint's time efficiency relative to its plan. A score of 100 means the planned work was completed within (or exactly at) the estimated time; a score below 100 means the sprint fell short of, or overran, its estimate. A higher score therefore indicates better adherence to the plan, reflecting accurate estimation, high productivity, or both.

What it Measures:

  • Time Efficiency / Estimation Accuracy: Primarily, it reflects how well the team's actual work duration matched the planned estimates for a given sprint.
  • Sprint Performance Proxy: It serves as an indicator of the sprint's success in delivering the planned scope within the expected time budget.
  • Progress Evaluation: The presentation suggests it can be a "fair measure for evaluating the progress of a project development" (Slide 86). Tracking the score over time can reveal trends in estimation accuracy or productivity.

In the BSIC case study, the average Effectivity Score across 15 sprints was 86.49% (Slides 70, 86).

Stakeholder and user satisfaction were evaluated as part of the framework's performance assessment in the Business Sector Information Center (BSIC) case study, using qualitative methods (Slides 65, 74-78, 83, 87-88).

Evaluation Method:

  • Questionnaires: Specific questionnaires were designed for different groups:
    • Stakeholders Satisfaction Questionnaire: Assessed aspects like GUI usability, completion of operations, ease of understanding, problem-solving, meeting requirements, delivery time/budget, ROI, and welcoming changes (Slide 75).
    • Users Satisfaction Questionnaire: Focused on the end-user experience with the developed portal, covering friendliness, ease of use, completion of operations, regular updates, error-free running, technical support communication, problem resolution, meeting needs, search efficiency, navigation, and team's responsiveness to changes (Slide 77).
  • Analysis: The responses were gathered and analyzed using the SPSS (Statistical Package for the Social Sciences) software (Slide 87).
  • Interpretation: The results were interpreted using a Likert scale (1-5), where the calculated Weighted Average (W.A.) score for each question or category was mapped to a satisfaction level (e.g., Poor, Accepted, Agreeable, Satisfactory, Enriching) based on predefined ranges (Slide 82).
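The weighted-average-to-level mapping can be sketched as a lookup. The 1.80-2.59 (Accepted), 3.40-4.19 (Satisfactory), and 4.20-5.00 (Enriching) bands come from the text; the Poor and Agreeable bands are an assumption based on the standard uniform 0.8-wide split of a 1-5 Likert scale.

```python
# Sketch of mapping a weighted average (W.A.) to a satisfaction level.
# Bands marked "assumed" are inferred from the standard 1-5 Likert split.

def satisfaction_level(weighted_average: float) -> str:
    bands = [
        (1.80, "Poor"),          # 1.00-1.79 (assumed)
        (2.60, "Accepted"),      # 1.80-2.59 (from the text)
        (3.40, "Agreeable"),     # 2.60-3.39 (assumed)
        (4.20, "Satisfactory"),  # 3.40-4.19 (Slide 82)
    ]
    for upper, label in bands:
        if weighted_average < upper:
            return label
    return "Enriching"           # 4.20-5.00 (Slide 82)

print(satisfaction_level(3.76))  # Satisfactory (stakeholders' average W.A.)
print(satisfaction_level(4.22))  # Enriching (users' average W.A.)
```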

Key Findings (Slides 76, 78, 83, 88):

  • Stakeholders Satisfaction: The average Weighted Average (W.A.) score was 3.76. According to the Likert scale interpretation (3.40-4.19 = Satisfactory), stakeholder satisfaction was rated as Satisfactory. They particularly rated ROI and Problem Solving highly (W.A. 4.67 and 4.33 respectively), while supporting team satisfaction was lower (W.A. 2.33).
  • Users Satisfaction: The average W.A. score was 4.22. According to the Likert scale (4.20-5.00 = Enriching), user satisfaction was rated as Enriching. Users rated meeting their needs and the team welcoming changes very highly (W.A. 4.57), along with ease of navigation and friendliness (W.A. 4.40 and 4.43). Search efficiency was rated slightly lower but still well within Satisfactory (W.A. 3.77).

Overall, the evaluation indicated positive reception from both stakeholders and especially end-users regarding the outcome of the project developed using the proposed Agile framework.

A specific questionnaire was administered to the development team members who had experience with both the proposed Agile framework (used in the BSIC project) and traditional development methods. This aimed to compare their satisfaction levels with the teamwork aspects under both approaches (Slides 74, 79-81, 83, 88).

What Was Evaluated (Teamwork Satisfaction Questionnaire - Slide 79):

The questionnaire covered several aspects of the team's experience, including:

  • Ability to learn new methodologies.
  • Degree of collaboration within the work team.
  • Adequacy of time scheduled for tasks.
  • Collaboration with the customer.
  • Ability to prioritize satisfying the customer.
  • Level of team self-organization (rules, regulations).
  • Conflict resolution effectiveness.
  • Punctuality in delivering according to schedule.
  • How much the methodology assisted work development.
  • Potential for applying the methodology to future projects.

Results (SPSS Output on Slide 80, Summary on Slides 81, 83, 88):

  • Data Analysis: Responses were analyzed using SPSS to calculate the Weighted Average (W.A.) satisfaction score for each question under both the Traditional Method and the Agile Framework conditions.
  • Overall Comparison:
    • The average W.A. for the Agile Framework was 4.25. Based on the Likert scale (4.20-5.00 = Enriching), team satisfaction with the Agile framework was rated as Enriching.
    • The average W.A. for the Traditional Method was 2.39. Based on the Likert scale (1.80-2.59 = Accepted), team satisfaction with the traditional method was rated as merely Accepted.
  • Specific Differences (Slide 81 Chart): The bar chart visually confirms that the Agile Framework consistently scored much higher across almost all aspects compared to the Traditional Method. For example, collaboration (within team and with customer), adequacy of time, self-organization, conflict resolution, punctuality, assistance to development, and future applicability were all perceived much more positively under the Agile framework.

Conclusion: The evaluation strongly indicated that the development team members found the proposed Agile framework significantly more satisfying to work with compared to traditional methods, particularly regarding collaboration, flexibility, self-organization, and overall effectiveness.

Slide 12 explicitly lists several key characteristics that define Agile development approaches:

  • Modularity: Development is structured around manageable modules or components, likely applied at the process level (breaking work down).
  • Iterative: Work proceeds in short cycles (iterations or sprints). Each cycle produces a potentially usable increment of the software, allowing for repetition and refinement.
  • Short Cycles: These iterations enable fast verification of work and quick correction of issues or misunderstandings.
  • Time-bound: Iterations have a fixed duration, typically ranging from one to six weeks, providing a regular cadence and predictability.
  • Adaptive: Agile methods are designed to accommodate change and respond to emergent risks or new requirements discovered during development.
  • People-oriented: Emphasis is placed on individuals, collaboration, skills, and communication rather than strictly on processes and tools.
  • Collaborative: Promotes close cooperation among team members and between the team and stakeholders (like the customer).
  • Communicative: Relies on frequent and clear communication as a primary mechanism for coordination and information sharing.

These characteristics collectively contribute to Agile's ability to deliver value incrementally, respond to change effectively, and maintain a focus on customer satisfaction and working software (as implied by advantages on Slides 13 & 14).

Slide 15 presents a diagram illustrating a core difference in how Agile and Traditional methodologies handle project constraints, specifically Functionality, Time, and Resources.

The diagram shows the following trade-off:

  • Traditional Methodologies:
    • Fixed Element: Functionality (Scope). The goal is typically to deliver a predefined set of features agreed upon at the beginning.
    • Flexible Elements: Time and Resources. If challenges arise or the scope proves larger than expected, the schedule (Time) and/or budget/team size (Resources) are often adjusted to ensure the fixed functionality is delivered.
  • Agile Methodologies:
    • Fixed Elements: Time (iterations/sprints have fixed length) and Resources (team size is usually stable during an iteration).
    • Flexible Element: Functionality (Scope). Within a fixed timebox (sprint) and with a fixed team, the amount of functionality that can be delivered is variable. Priority is given to delivering the most valuable features first, and the scope for later iterations can be adjusted based on feedback and progress.

In essence: Traditional methods aim to deliver a fixed scope, potentially allowing time and cost to vary. Agile methods aim to deliver value within fixed time and cost increments (sprints), allowing the scope to be flexible and adaptable over the course of the project. This flexibility in scope is a key reason why Agile is considered well-suited for projects with uncertain or changing requirements, like many E-Government initiatives.
