[CF-Devel] Fwd: open source quality assurance

Bob Tanner tanner at real-time.com
Mon Jan 14 16:28:35 CST 2002


Thought I'd share this with the rest of you.

----- Forwarded message from Luyin Zhao <lzhao at cse.unl.edu> -----

Dear Bob Tanner,
My professor S. Elbaum (Univ. of Nebraska-Lincoln) and I
are investigating how quality assurance activities are
carried out in open source development, and what differences
can be found when compared against more traditional
industry practices. A pilot study was performed and
published ("A Survey on Software Quality Related Activities
in Open Source", Software Engineering Notes, ACM, 54-57,
June 2000). We are extending the study through the attached
survey (22 multiple choice questions), which will provide
additional empirical evidence exposing the unique
characteristics of the open source development movement.
We hope you can take a few minutes to complete it based on
your open source project 'Crossfire RPG game' released on
SourceForge.net.
Your effort is greatly appreciated!

Luyin Zhao


Survey Questionnaire:

Part A: Project characterization

1. What is the estimated number of lines of code of the 
project?
A. < 1,000
B. 1,000 - 10,000
C. 10,000 - 100,000
D. > 100,000

Response:


2. How many software developers are actively involved in 
this project?
A. 1
B. 1-5
C. 5-20
D. +20

Response:


3. What is the estimated current number of users of this 
product?
A. 1-5
B. 5-10
C. 10-50
D. +50

Response:


4. How often are the product releases (on average)?
A. Every week
B. Every month
C. Every quarter
D. Every 6 months
E. Other

Response:


5. How long has the product been available in the market?
A. Less than 6 months
B. Between 6 months and a year
C. Between 1 and 3 years
D. More than 3 years

Response:


Part B: Respondent characterization

6. Software development experience
A. <1 year
B. 1-5 years
C. +5 years

Response:


7. What level of participation do you have in the project?
A. Dedicated full-time 
B. Part-time, supported by employer
C. Part-time, personal time 
D. Other

Response:


Part C: Process

8. Did the project start to satisfy:
A. Personal needs
B. Company needs
C. Community needs
D. Other

Response:


9. What percentage of your product changes from release to 
release (major releases)?
A. < 20 %
B. 20% - 40%
C. 40% - 60%  
D. 60% - 80%
E. > 80%

Response:


10. Do you use software configuration management tools
 (version control tools) ?
A. Yes (Name          )
B. No
C. Not sure

Response:


11. Do you use any "bug" tracking tool?
A. Yes (Name          )
B. No
C. Not sure

Response:


12. Which of the following documents is used to support the 
project?
A. Document to plan releases (dates and content)
B. Design document
C. Installation and building guidelines
D. "TODO" List (including list of pending features and open 
bugs)

Response:


Part D: Testing

13. How do you validate your product before release?
A. Provide inputs trying to imitate user behavior (ad-hoc)
B. Use script to provide random values as inputs
C. Provide extreme values as inputs
D. Use assertions (assert, JUnit, others)
E. Other

Response:


14. What percentage of your time and effort is spent on 
testing?
A. < 20 %
B. 20% - 40%
C. 40% - 60%  
D. 60% - 80%
E. > 80%

Response:


15. Do you have a "baseline" test suite that you re-run on 
your software before every release?
A. Yes 
B. No

Response:


16. What percentage of source code is covered by the 
testing activity?
A. < 20 %
B. 20% - 40%
C. 40% - 60%  
D. 60% - 80%
E. > 80%

Response:


17. The previous coverage information was based on: 
A. Reports by coverage tool (Name it: __________)
B. Personal estimation

Response:


Part E: Users Participation and Feedback

18. How soon after release do you hear back from users?
A. Hours
B. Days
C. Weeks

Response:


19. What percentage of "bugs" did users find?
A. < 20%
B. 20% - 40%
C. 40% - 60%  
D. 60% - 80%
E. > 80%

Response:


20. What percentage of code has changed in response to
users' suggestions?
A. < 20%
B. 20% - 40%
C. 40% - 60%  
D. 60% - 80%
E. > 80%

Response:


21. How do you evaluate the "bug" locating effectiveness of 
external users?
A. They found "hard" bugs that could have taken us a long 
time to find
B. Given some more time, I would have found most of them
C. They don't help much
D. Other

Response:

22. The modifications suggested by users, are:
A. Very creative 
B. Reasonable 
C. Useful but not so necessary 
D. Not fitting into my application design   
E. Other

Response:




----- End forwarded message -----

-- 
Bob Tanner <tanner at real-time.com>              | Phone : (952)943-8700
http://www.mn-linux.org, Minnesota Linux         | Fax   : (952)943-8500
Key fingerprint =  6C E9 51 4F D5 3E 4C 66 62 A9 10 E5 35 85 39 D9
