Many businesses do not use the Software Development Life Cycle. What is a likely explanation?

I believe one reason many businesses do not use the Software Development Life Cycle is a lack of awareness that such a process exists. I experienced this firsthand in my own attempts at running projects early in my career. My first positions were in smaller companies where less formality was necessary. Those experiences with planning and running projects were for small tasks. I would do the development for the solution, working with at most three other people; the other technical people on the task would test or help out in other areas. Most decisions were made at the office kitchen table, and little needed to be recorded in documents. Once an awareness of the Software Development Life Cycle exists, the next problem is how to produce the artifacts necessary for project success. We have been given a very wide and shallow introduction to the System Development Life Cycle. That is fine if you have never seen it before, but I now have specific questions about what needs to be documented in each project phase. For instance, what do good requirements look like? Seeing templates or samples of existing project documents would speed an immature team's understanding of where formality can stabilize a project and how much overhead a formal process will add to the project timeline. The final possibility that may contribute to the lack of formal development processes is that business management does not believe the SDLC brings enough benefit compared to the existing system of project management. The SDLC affects not only the Information Technology team, but also the departments receiving and affected by the creation of an automated solution to a business problem. I found an article on the web about formal methods of engineering. 
I think this paragraph brings some insight into the business decision of bringing formality to an existing business infrastructure: "The decision to use a new methodology is driven by economics: Do the benefits of the new method exceed the costs of converting to it and using it by a sufficient margin to justify the risks of doing so?" [1] [1] Shawren Singh and Paula Kotzé, "An Overview of Systems Design and Development Methodologies with Regard to the Involvement of Users and Other Stakeholders," University of South Africa ...

August 8, 2005 · 2 min · 389 words · Jim Thario

What is the most important phase of the Software Development Life Cycle (SDLC)?

In my experience developing software, the most important phase is the Elaboration phase. I feel it is the most important because it is the join point between the definition of the business problem and the construction of the solution. During the Inception phase, you baseline your vision and your solution to a problem, and you make the business case for building it. At the end of the Inception phase you should have support and funding from the business to move forward. Everyone involved with the project should agree on what the team is trying to build. If there is any misinterpretation, especially from your funding source, you need to deal with it here. The Elaboration phase is when the technical solution is determined, not actually built. It is the clarification phase: there is modeling, risk analysis, prototyping, and refining of the requirements. This phase is when you find out whether the solution can actually be built. You leave Elaboration with an architecture on paper (or in a modeling tool) and something that runs just enough to prove the system can be completed successfully. I believe this is the point where funding really needs to kick in. During Construction and Transition, headcount is being added in the form of developers, testers, documentation writers, test engineering, release engineering, legal, and so on. You are beginning to train the trainers, the sales staff, and the consultants. If the project is not going to succeed, it is in your best interest to kill it off before you begin the Construction phase. ...

August 8, 2005 · 2 min · 264 words · Jim Thario

Murphy

This is Murphy. He is a Rottweiler mix - mostly mixed with love and anti-seizure medication. From Dogs. Photo by www.nicolehowardphotography.com

August 7, 2005 · 1 min · 21 words · Jim Thario

Cancun between hurricanes

Here is a small photo album from our trip. Cancun 2005

August 7, 2005 · 1 min · 11 words · Jim Thario

What are the elements of a good Web page design?

I think this can be answered from the user's perspective and from the developer's perspective. A page can be considered well designed if it looks good, works with many browsers, and can be maintained by someone other than the original author. From the user's perspective, I came up with the following list: Accessibility - the page is compatible with screen readers and alternate input devices. At work we recently went through a remediation process with one of our web sites; we needed to assure HR the site was compatible with accessibility utilities. I think about 75% of this can be handled by writing good HTML source. In addition, testing tools such as WebKing can help identify other problems that can prevent the web code from working in certain situations. Navigation - the page is easy to leave. Put another way, the page should have the links necessary to navigate to other major areas, if it is part of a larger web site. Placement - the page is easy to find in the site and navigate to. Compatibility - the page can be loaded and properly displayed in popular browsers. In e-commerce it is important to give this item some priority: you want to encourage visitors to browse and buy regardless of the specific brand or version of their technical resources. This is also important to consider if your viewer base includes users with handhelds or Internet-capable cell phones. Organization - information on the page is presented in a visually appealing way, including text style choice and page positioning. From the developer's perspective: Documentation - comments in the code or a short design note help the author remember what they did and help others maintain the page later. Organization - the page's source is consistently organized and formatted into blocks. With today's tools that can reformat source code, this is less of a problem. ...

August 7, 2005 · 2 min · 330 words · Jim Thario

Name two differences between designing for a Web page and for print-based media

The difference that draws my attention is that print media is static: ink or another compound is bonded to a page and permanently fixed. Unlike a web site, there is no hope of that print jumping up and rearranging itself if the user wants a different layout. The first difference, then, is that web publishing can introduce dynamic content to the user in a number of ways. Web sites used for e-commerce can show customized content based on the user's past purchase history, or on a particular preference for how the page is arranged. My Yahoo is another example, where each user has a customized view of the information they choose. The other primary difference I can think of between web and print media is that designing for a newspaper, for example, is a controlled process from end to end, unlike a web page, where the rendering and quality of the final product are out of the control of the publisher of the content. The newspaper publisher chooses layout, fonts, and other aspects of style just as a web publisher would, but the similarity stops there. A print publisher also chooses the rendering mechanism and the paper it is printed on. In web publishing, that last step is variable: browser differences can produce different output from the same HTML code. ...

August 7, 2005 · 2 min · 239 words · Jim Thario

Self destructing servers

I had an idea today about how to make servers self-destruct in case of some type of security breach. I guess this might be influenced by the Star Trek movie I saw the other night; they seem to blow up more Enterprises in the recent stories. My idea is to keep a blank CD-R in the drive of the server at all times. On the hard disk there is an ISO file that is written to the CD-R on demand, and then the server is rebooted. The server will ignore the blank CD-R during reboots until it is written with a valid image. The contents of the ISO need to be a boot loader and kernel, such as GRUB and Linux, plus a file system with a wipe program. The wipe program starts once the kernel boots; it iterates through the collection of hard drives the kernel found during the boot process and overwrites them with a pattern. This kind of self-destruct sequence can be automated with a script and invoked through a terminal on the local network or through a VPN. It could also be loaded into cron and regularly deferred to keep it from going off. So, if your servers are under heavy attack, and you have no other choice, start the countdown. :-) ...
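The trigger script could be sketched as below. This is a minimal illustration of the idea, not a tested recipe: the image path, the device name, and the use of cdrecord to burn the image are all assumptions, and dry_run defaults to True so that running the sketch only prints the commands it would execute.

```python
import subprocess

def self_destruct(iso="/root/wipe.iso", device="/dev/cdrom", dry_run=True):
    """Burn the bootable wipe image to the blank CD-R, then reboot.

    With dry_run=True (the default), the commands are printed instead
    of executed, so nothing destructive happens.
    """
    steps = [
        ["cdrecord", f"dev={device}", "-data", iso],  # write the boot image
        ["shutdown", "-r", "now"],                    # reboot into the wipe kernel
    ]
    for cmd in steps:
        if dry_run:
            print("would run:", " ".join(cmd))
        else:
            subprocess.run(cmd, check=True)
    return steps

self_destruct()  # dry run: prints the two commands in order
```

A cron entry could call this on a schedule, with a separate job that keeps pushing the trigger time back, so the wipe only fires if nobody is left to defer it.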

June 9, 2005 · 2 min · 221 words · Jim Thario

School work

I graduate in November, and then I can grow up and get a job. I have been attending a UNIX course in school the past few weeks. This week we have been studying some cost configurations for running UNIX and Linux in various network serving roles. A topic that came up was the benefit of using the free Linux distributions and related software for low-cost server operations. I have a home network, and I think I count as a low-cost operation. I will not spend excessive money on my network, and I have never felt compelled to spend money, because the software I need can be obtained for free. For example, my primary server at home routes email, serves several web sites, and acts as a router between the public Internet and my home network. It is a big server. I run Fedora Core 3 as my operating system. The email routing incorporates Dovecot, sendmail, amavisd-new, SpamAssassin, and ClamAV. The last three of these programs work in tandem to keep dangerous email from passing through my server. The spam analyzer learns the difference between wanted and unwanted email, while the open source ClamAV scanner automatically checks for updated virus signatures every hour. The amavisd program acts as the mediator between the spam and virus services and my email server. The best part is that tainted email is rejected in real time, while the sender is still trying to move it to my server. As a network router, my giant egg basket of a server watches both incoming and outgoing connections for suspicious activity on all network adapters using Snort. What would I pay to recreate this configuration with commercial software? ...
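The mediator role that amavisd plays can be sketched conceptually: the mail server hands each message to the filter, which consults the spam and virus scanners and returns an SMTP-style verdict before the message is accepted. This is a toy illustration of the idea, not amavisd-new's actual interface; every function name and check below is invented for the example.

```python
def carries_virus(message: str) -> bool:
    # Stand-in for a ClamAV signature match (real scanning is far richer).
    return "EICAR" in message

def looks_like_spam(message: str) -> bool:
    # Stand-in for SpamAssassin's score-based classification.
    return "FREE MONEY" in message.upper()

def mediate(message: str) -> str:
    """Return an SMTP-style verdict while the sender is still connected."""
    if carries_virus(message):
        return "550 rejected: virus detected"
    if looks_like_spam(message):
        return "550 rejected: spam"
    return "250 accepted"

print(mediate("Hello from a friend"))  # 250 accepted
```

The point of rejecting at this stage is that the sending server, not mine, is left holding the tainted message: it never lands in a local mailbox or bounce queue.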

June 4, 2005 · 2 min · 274 words · Jim Thario

Old pictures of a warm place

Jen and I spent our 10th Anniversary in Hawaii two years ago. 10th Anniversary in Hawaii

June 4, 2005 · 1 min · 16 words · Jim Thario

First entry

The first entry is dedicated to my big dog. We miss you. From www.thario.net

June 4, 2005 · 1 min · 14 words · Jim Thario