Running Thoughts

Tim Bauer's webcast summaries/insights

CMMI = Can't Manufacture Much Innovation

Not a running thoughts post … just a wandering side comment.

I was reviewing Joe Arnold’s post on why product companies, like Yahoo, don’t use CMMI. He raised an interesting point: CMMI works in operational models but NOT in innovation models. First he talked about the focus/application of CMMI:

    “A high CMMI-level rating, while it decreases risk, also has overhead that adds to costs. The higher the level of certainty, the higher the cost.”

Who wants lower risk more than systems in an operational steady state? That is what people who buy based on CMMI rating are looking for. You don’t innovate on your mail server, your financial system, or your CRM app. You want stability with slow evolution on those. Stay at par w/ competitors. But if you have aspects of technology that drive your competitive advantage, you aren’t looking for stability to be the watchword of how you deliver IT solutions. First and foremost you have to innovate. Joe points out (with Yahoo as his inferred backdrop) that innovation doesn’t come from CMMI … it is smothered:

    “Competition [for companies that thrive on innovation] from startups and other internet companies is fierce. Time to market is critical. Pursuing a CMMI level that would add any costs to product development isn’t acceptable.”

He then goes on to say how Agile and CMMI can co-exist (olive branch) but leaves on the table that innovation isn’t happening at the same pace in a CMMI shop (compared to a non-CMMI shop).

All of the above is interesting, to me, in two ways:

    1. You Need Dr. Jekyll and Mr. Hyde in Corporate Shops. When you look at your operations you really need to think about two types of teams. One is a group of your “Rangers” who live to create innovation where you need it, with management models, incentive plans, methodologies, and planning cycles that match that style. Surrounding the “Rangers” are your stable “core” resources: players incented by stability and rigor with, again, matching methods, incentives, and methodologies. Both need to exist or you will end up being one but not the other.
    2. The U.S. Role in the Global IT Market. As cost arbitrage continues, the U.S. IT worker should look to the role of (1) innovator or (2) lead on partially outsourced “core” apps. Being the innovator (finding the technologies that drive market advantage for the company and aligning with them) is the wiser play over time, as full outsourcing of core apps to CMMI players is always an option.

Very interesting. Thanks for the insight Joe.

March 25, 2008 Posted by | 3-No Go (unless you are bored) | , , | 1 Comment

Proj Mgmt: Agile Innovation

So I watched this webcast this morning.

Agile Project Management – Reliable Innovation

Recently (the last 4 to 5 qtrs) we have been approached by clients that are trying to apply agile in their environments. With that in mind I am becoming more curious about what the ‘pioneers’ in this space are doing.

At a high level I took this away:

    1. Agile Doesn’t Mean Agile All the Time. It was interesting how the speaker had a variety of points where he made sure to clarify that Agile isn’t agile all the time. It’s about being flexible in how you deliver, primarily around the uncertainty in the project. More uncertainty, more agile; less uncertainty, more classical (still with pieces of agile). Good message.

    2. They still avoid that budget thing. The metrics quoted by this guy were about match to business goals … not about cost to achieve them. The unanswered question is … did we achieve a 100% match but spend 80% of the budget (w/ overruns) to achieve the last 20%? Does that last 20% really drive 80% of the value? Our estimating model does a better job (and could be applied in agile structures) of showing the cost associated with various features, from which clients could assess cutting specific areas that lack payback.

    3. Liked Their Parking Lot Status. They showed an example of a parking lot status (snapshot it if you watch). Basically it was a 10,000-foot view of all the subject areas, activities in those, and sub-activities (metrics on stories that are ‘done done’), with stoplighting at the activity level. Could do that with process views as well.
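The parking-lot roll-up in point 3 is easy to sketch in code. Here is a minimal Python illustration; the data shape and the 90%/50% stoplight cutoffs are my own assumptions, not anything the speaker showed:

```python
# Minimal sketch of a "parking lot" status roll-up (structure and cutoffs
# are assumptions, not the webcast's implementation): subject areas contain
# activities, and each activity's stoplight comes from the share of its
# stories that are "done done".

def stoplight(done: int, total: int) -> str:
    """Map a completion ratio to a stoplight color."""
    if total == 0:
        return "white"          # nothing planned yet
    ratio = done / total
    if ratio >= 0.9:
        return "green"
    if ratio >= 0.5:
        return "yellow"
    return "red"

def parking_lot(subject_areas: dict) -> dict:
    """Roll story counts up to activity-level stoplights."""
    return {
        area: {
            activity: (done, total, stoplight(done, total))
            for activity, (done, total) in activities.items()
        }
        for area, activities in subject_areas.items()
    }

# Hypothetical project data: {area: {activity: (stories done-done, total)}}
status = parking_lot({
    "Ordering": {"Cart": (9, 10), "Checkout": (3, 8)},
    "Billing":  {"Invoicing": (1, 6)},
})
for area, activities in status.items():
    for activity, (done, total, color) in activities.items():
        print(f"{area}/{activity}: {done}/{total} stories -> {color}")
```

The same roll-up would work for "process views": swap stories for process steps and the stoplighting logic stays identical.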

Here are my detail notes for those inclined.

Quote:
• Road to Agile Series
…………….Avoiding the Four Roadblocks to Agile Adoption
…………….The Agile Customer Toolkit
…………….Moving from Test-Last to Test-Driven.
…………….Agile Project Management – Reliable Innovation
• 3:00 – Presenter – Jim Highsmith
…………….○ Author, 25 yrs, Co-author of the Agile Manifesto
• 5:00 – Agile
…………….○ Deliver value
…………….○ Have a good time
• 5:30 – Objectives
…………….○ Reliability / Value Via
…………………………..§ Continuous Innovation
…………………………..§ Product Adaptability
…………………………..§ Reduced Delivery Schedules
…………………………..§ People / Process Adaptability
…………………………..§ Reliable Results
• 8:00 – Example – Tablet PC Graphics Package
…………….○ Envision and Evolve project (not defined and locked down)
…………….○ 3 stories: Version 1, Tech Debt, Agility
• 9:45 – Essence of APM (Agile Project Management)
…………….○ Innovation
…………….○ Get Right People Delivering
…………….○ Core Values & Principles (ala Agile Manifesto)
…………….○ Process Framework / Practices (envision, speculate, adapt, explore, close)
• 11:30 – Characteristics
…………….○ Vision / Customer Value Driven
…………….○ Key usage is projects w/ uncertainty of goals
…………………………..§ Change over: Time, Understanding, Cone of Uncertainty
…………………………..§ Respond to Change is critical
…………….○ Feature Driven Development
…………………………..§ Features are “done done” (meaning it is developed and accepted)
…………….○ Iterative Development
…………….○ Collaborative Development
• 15:00 –
…………….○ Client producing documentation … extensive but not formalized.
…………….○ Collaborative development sessions
…………….○ Pair programming (team areas and cube workspaces). Most of time in workspaces
• 16:30 – Envision, Explore
…………….○ Collaborative Planning, Development
…………….○ Envision, multiple Explores … envision … cycle
• 17:30 – Experimentation
…………….○ Harvard Business School – 2003 … Experimentation is key. Organizations impede it. Examples of how experimentation drove down cost (computer simulations, virtual crashes, 30% advantages in design).
• 20:00 – Problem Domains
…………….○ Optimization – Production Drilling for example
…………….○ Adaptation – Exploration for oil
• Exploration Factor –
…………….○ Requirements Vector: Erratic, flux, routine, stable
…………….○ Tech vector: Bleeding edge, leading, familiar, well known
• 22:30 – Manage “Uncertain Projects” via adaptive and measure success differently.
…………….○ Manage to vision on adaptive
…………….○ Manage to budget on optimization
• 23:30 – Stages of Project Management
…………….○ 1 – Chaos
…………….○ 2 – Prescriptive Control
…………………………..§ Plan work. Work plan.
…………………………..§ Waterfall
…………….○ 3 – Adaptive Control
…………………………..§ Conformance to end goal (not plan)
…………………………………………□ [bauer thought – end goal is sometimes budget w/ features]
• 25:00 – Parking Lot Diagram
…………….○ Status report (12 years)
…………….○ Subject areas macro
…………….○ Activity areas micro
…………….○ Sub Activity (metrics – # stories complete)
…………….○ [bauer thought – stories could equal classic processes]
• 27:00 – What is agility
…………….○ Create change
…………….○ Respond to change
…………….○ Balance of flex and structure
• 28:30 – APM Guiding Principles
…………….○ Customer-Product
…………………………..§ Deliver customer value
…………………………..§ Champion Tech excellence
…………………………..§ Employ iterative feature delivery
…………….○ Leadership-Collaboration
…………………………..§ Build adaptive teams
…………………………..§ Inspire Exploration
…………………………..§ Simplify
• 29:30 – Help teams define their ‘own’ values
…………….○ Speak to interaction
…………….○ Speak to how decisions get made
• 30:30 – Q&A
…………….○ Barely Sufficient definition?
…………………………..§ What are the docs / rigor that are ‘barely’ acceptable for that client (from informal docs to rigor)
…………….○ Hard ROI on Agile versus classic?
…………………………..§ Internal study shows that new agile projects … industry norms … both more productive. Order of magnitude fewer defects
…………………………..§ [bauer note: no metrics on budget performance difference]
• 33:45 – APM Lifecycle
…………….○ Envision
…………………………..§ Product Vision Box
…………………………………………□ Example – Create a vision of the product. Sell your product on a box. Essence of what the customer wants.
…………………………………………□ Elevator pitch model
…………………………..§ Project Data Sheet (Scope)
…………………………..§ Project Community (Project Mgr/Dev, Product Mgr/Customers)
…………………………..§ Serial to Iterative
…………………………………………□ Iteration 0 (arch framework, dev requirements)
…………………………..§ Balance Anticipation / Adaptation
…………………………………………□ A lot of technical debt drives it to be unmaintainable
…………………………..§ Feature Cards (Documentation)
…………………………………………□ Conversation level (not a requirements spec)
…………………………………………□ Confirmation is the acceptance test
…………………………………………□ Pink cards are iterations, stack iterations below
…………….○ Speculate
…………….○ Adapt
…………….○ Explore
…………………………..§ Exploration planning, development, review / adapt w/ customer
…………………………..§ Iteration planning …
…………….○ Close
…………………………..§ Team Retrospectives. Team, product, process. What Went Well (WWW). Concerns / Improvements. Questions.
• Command – Control … to … Leadership / Collaboration
…………….○ “Simple, clear purpose and principles give rise to complex, intelligent behavior”
…………….○ Build adaptive teams … self discipline, self organization
…………….○ Manager Role
…………………………..§ Right people
…………………………..§ Barely sufficient framework
…………………………..§ Open flows
…………………………..§ Steer don’t control
…………………………..§ Share decision making
…………….○ Team Members Role
…………………………..§ Accountability for:
…………………………………………□ Results
…………………………………………□ Relationships
…………………………..§ Confront reality with rigorous thinking
…………………………..§ Engage in intense interaction
• Gantt example
…………….○ Waterfall. No feedback. % complete on artifacts.
…………….○ Agile. Plan, collaborate, develop. Feedback and control mechanism is shippable features (as shown by the parking lot)
• Resources
…………….○ agilepmgroup@yahoogroups.com
…………….○ www.adaptivesd.com (his site)
• 52:00 – Q&A
…………….○ Measure Productivity in Agile?
…………………………..§ Function points
…………………………..§ Lines of code (bad)
…………………………..§ Problem in measurement in agile … what is the measure of delivering something of zero value? Agile ensures that unnecessary features are cut out via process
…………….○ Don’t keep detailed documentation?
…………………………..§ Clarify.
…………………………………………□ Format is different. Documentation is there. Flip charts. Digital images.
…………………………………………□ Timing might be different (sometimes formalized at end)
…………….○ Planning?
…………………………..§ Release planning (all project)
…………………………………………□ Same estimating techniques
…………………………………………□ High uncertainty … problem
…………………………………………□ Put in a graph of time and % of certainty.
…………………………..§ Iteration planning
…………….○ How agile should project be?
…………………………..§ What is the uncertainty? But … even replacing an existing system had change and requirements (people wanted new stuff). Difficult to dig features out of the code (more documentation). Still be adaptive and agile (iterative dev, planning)
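The Exploration Factor notes at 20:00–22:30 can be turned into a toy score. The requirements and technology orderings below come from the talk; the numeric scale, the additive combination, and the cutoff between "optimize" and "adapt" are my own assumptions, not Highsmith's actual table:

```python
# Toy sketch of the Exploration Factor idea: combine a requirements rating
# and a technology rating into one uncertainty score, then pick a management
# style. Scale, combination rule, and cutoff are assumptions for illustration.

REQUIREMENTS = ["stable", "routine", "flux", "erratic"]           # low -> high uncertainty
TECHNOLOGY = ["well known", "familiar", "leading", "bleeding edge"]

def exploration_factor(reqs: str, tech: str) -> int:
    """Score from 1 (fully known) to 7 (maximum uncertainty)."""
    return 1 + REQUIREMENTS.index(reqs) + TECHNOLOGY.index(tech)

def management_style(score: int) -> str:
    """Crude cutoff: low scores suit optimization, high ones adaptation."""
    if score >= 4:
        return "adaptive (manage to vision)"
    return "optimizing (manage to budget)"

score = exploration_factor("flux", "leading")
print(score, management_style(score))   # -> 5 adaptive (manage to vision)
```

This mirrors the 22:30 point: the higher the score, the more you manage to vision rather than budget, and the more agile the delivery style should be.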

June 9, 2006 Posted by | Uncategorized | , | Leave a comment

MSFT: VSTS – Measure Projects

Watched this webcast today

Using Metrics from Visual Studio 2005 Team Foundation Server (TFS) to Manage and Troubleshoot Your Projects (Level 200)

given our current clients that are looking at TFS and/or asking us to help implement it. It was a nice run-through of various reports and how MSFT sees them enabling the project management group. The keys I took away, however, were:

1. AVOID THE WORD AGILE? It is interesting to me that they specifically never said agile in this presentation … while the entire basis of their demo was their CMMI process (agile delivery approach). They referred to it as “Value Up”. I assume this is coming from their PM SME group. Stop fighting the religious war over the word ‘Agile’. Take the concept and roll out the parts that make sense. Intriguing. Very intriguing. We are contemplating a similar move on the SCM and Usability fronts.
2. SINGLE REPORTING VISION? It got me back on my horse that we need to have a singular reporting vision. Recent attempts at TFS rollouts by our teams are going with the angle of one project plan for the delivery team in TFS while allowing the PM group to have a separate one. Watching this, it is very clear that with some deep thought the two can be integrated: allow the dev team to work on infinitely granular tasks that ALL tie to a super set of project tasks (be it CMMI or another delivery method).
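The integration in point 2 is just a parent/child roll-up. Here is a small Python sketch of the idea; the field names and data shape are my own illustration, not TFS's actual work-item schema:

```python
# Sketch of the "single reporting vision" idea: granular dev-team work items
# each carry a parent_id pointing at a plan-level task, so remaining hours
# roll up automatically into the PM group's view. Field names are assumed.

from collections import defaultdict

def rollup(plan_tasks: dict, work_items: list) -> dict:
    """Sum remaining hours of granular work items into plan-level tasks."""
    remaining = defaultdict(float)
    for item in work_items:
        remaining[item["parent_id"]] += item["remaining_hours"]
    return {task_id: (name, remaining[task_id])
            for task_id, name in plan_tasks.items()}

# Hypothetical plan tasks and dev-team work items
plan = {1: "Build order module", 2: "Stabilize billing"}
items = [
    {"parent_id": 1, "remaining_hours": 3.0},
    {"parent_id": 1, "remaining_hours": 5.5},
    {"parent_id": 2, "remaining_hours": 2.0},
]
print(rollup(plan, items))
# -> {1: ('Build order module', 8.5), 2: ('Stabilize billing', 2.0)}
```

With one shared parent key, the dev team keeps its fine-grained tasks while the PM group reports only at the plan level, which is the integration the post argues for.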

Here are my rough notes from the presentation:

Quote:
Tom Patton – Program Manager for Team Foundation Server
…………○ 0:30 – Typical measures: Qlty, functionality, resources, time
…………○ 1:30 – Two paradigms
……………………§ Work down (up front planning)
……………………§ Value up (iterative, agile)
……………………§ 2:00 comparison of the two (planning, change mgmt, measure focus, qlty def, variance acceptance, work products, troubleshooting, approach to trust)
…………○ 5:30 – Performance impacted by metrics (chart). Focus becomes the metrics, not the true end goal … in this case … great software.
…………○ 7:00 – Friction Free Metrics in Team Foundation
……………………§ Unobtrusive collection
……………………§ Retain full history
……………………§ View metrics as they were in the past
……………………§ Changes as your process changes
…………○ 10:30 — DEMO – How metrics are selected in Team System
……………………§ 11:00 Team Explorer control (bugs, issues, task items, customer requirements, dev tasks, migration action status, etc)….see how tasks or work items relate to other items … as program mgr you can see related requirements
……………………§ (they are demo’ing from the CMMI templates)
……………………§ 16:00 — Talk to build functionality, bugs fixed, functionality impacted, test results against that build
…………○ 17:00 – Arch Overview
……………………§ Team Foundation DW System
………………………………□ Relational DW
………………………………□ OLAP Cube to render to players on project
………………………………□ Reports built on SQLServer and xls
………………………………□ Pulls from
…………………………………………¨ People, Dates, Projects
…………………………………………¨ Work Item Tracking
…………………………………………¨ Version control
…………………………………………¨ Team Build
…………………………………………¨ Team Test
…………………………………………¨ 3rd Party data sources
…………○ 19:00 – Value Up Questions
……………………§ Which requirements have been tested
……………………§ What is the quality
……………………§ Where do resources need to focus
……………………§ How far can we get in avail time
……………………§ How effective is our outsourced team
……………………§ How demonstrate practices for audit
……………………§ How did the team perf trend?
…………○ 21:45 – Shipped Reports
……………………§ From team explorer, project portal, doc lib, report mgr
…………○ 22:30 – Demo 2 – Reports
……………………§ Team Explorer
………………………………□ Build report
………………………………□ Testing (regression test review)
……………………§ Portal
………………………………□ Various report views
………………………………□ CMMI reports.
…………………………………………¨ Iteration comparison page
…………………………………………¨ Remaining work (ie number of bugs)
……………………………………………………◊ Open, activations, closures
…………………………………………¨ Issue/Task Work Item reporting
……………………………………………………◊ Blocked tasks, versus issues … associate
…………………………………………¨ Friction free is just reporting on what goes on day to day
…………………………………………¨ Project Velocity
…………………………………………¨ Reactivations. Defects carrying at any time … orange are reactivations (bugs reopened … poor testing)
………………………………□ 33:00 Front Page (Portal) Reports
…………………………………………¨ Quality Indicator Report (on builds)
……………………………………………………◊ Test / pass rate, code churn … degree of code change build to build (relative risk), purple line is code statistics.
…………………………………………¨ Customization might be to drill down from one report
…………………………………………¨ Files modified by date, by person, by directory, by type, by priority
…………○ 41:15 – DEMO – Authoring Reports
……………………§ What can work in XLS
………………………………□ Ability to make reports on the data in xls
………………………………□ PIVOT table
………………………………□ Pull data from Team System DW
………………………………□ Slice by state. Activations.
……………………§ Report Designer
………………………………□ 47:00 – Deploy to reporting services
………………………………□ Shown under the reports node off Team Explorer
………………………………□ It’s in Team System
…………○ 48:00 – Q&A
……………………§ Documentation?
………………………………□ Look in process guidance / template
………………………………□ Team portal. Description of reports from link there
……………………§ Include xls chart in reporting svcs report?
………………………………□ No. Adhoc is xls. Use reporting services
……………………§ Look at changed code in report?
………………………………□ Unknown. Change set is known. Diff tools could help it. It would be work.

May 6, 2006 Posted by | Uncategorized | , , | Leave a comment