Bing's Competitive Advantages

“Bing” it on Google

Introduction

Mark Penn is taking a big leap by leaving his firm to work for Microsoft on a mission to fix Bing. The company faces the challenge of becoming a competitive leader in the search engine market. Penn believes he can bring a different strategy to Microsoft. To improve Bing, Penn believes he needs to address stack ranking, a practice that steers product developers away from getting industry-leading products to market faster than the competition. According to the article, despite the marketing and the billions of dollars put into Bing, Google accounted for 69% of searches in June alone.

The key strategy to turn the company around would be to come up with an approach that makes Bing a different kind of search engine from Google.

Analysis

As stated above, Microsoft is trying to take the number one spot for the most used search engine. A competitive advantage can be defined as a firm’s ability to create value in a way that its rivals cannot. Microsoft, in partnership with Yahoo!, introduced Bing in 2009, allowing users to search for information on almost anything. When Bing was introduced in 2009, other search engines were already available.

Bing’s competitive advantage over the others was that it added subcategories to the organic search results, allowing the user to quickly see results in logical groups. This is an issue for Bing because it is no longer the only company that offers this type of search result, so it no longer holds a competitive advantage in the market on this basis. Another advantage of Bing is its “enhanced results”: intelligently organized results that users receive quickly and efficiently.

Because of these enhanced results, many people chose Bing over Google, but in recent years other search engines have put a greater emphasis on their speed and results and are bypassing Bing. If Bing still had any competitive advantage, it would be its enhanced results; however, most businesses that use competitors’ search engines find their speed and results sufficient and prefer them to Bing.

This is evident in the figures cited earlier: 69% of search engine users chose Google against 25.6% for Bing. What hurt Microsoft is that Bing was doing so poorly that the company offered Penn a position to help “fix” it. This could make or break Bing, depending on whether Penn can differentiate the search engine from the others. It will be difficult for Penn, seeing that he lacks search engine product development expertise.

No matter how speedy the search engine is or how good its results are, if Penn does not find a way to improve Bing’s market share, this will be just a fad that Microsoft has gone through, and Google will continue doing well.

Conclusions

To be successful in any market you need to be able to compete, and this is where Bing falls short. To make the search engine more successful, Microsoft needs to come up with a more effective business strategy. It needs to figure out where it best fits in and how to differentiate itself from other companies.

Microsoft also needs to define its goals and objectives, which will make it easier to target its users. If it applied the VRINE model to the business, it would quickly notice that its search engine is easily substituted and has little to set it apart from others. I believe Microsoft is taking the right steps by offering different options on its website, but it will need to do more, since it does not offer the same complementary services that Google does, such as Gmail or Google Maps.

While Mark Penn focuses on improving Bing’s market share, the rest of Microsoft’s management needs to place a strong focus on finding innovators who can help Bing become a major competitor once again.

Title- Can Mark Penn Fix Microsoft’s Bing?
Date- 7/23/2012
Website- http://www.forbes.com/sites/petercohan/2012/07/23/can-mark-penn-fix-microsofts-bing/2/
Citation- Cohan, Peter. “Can Mark Penn Fix Microsoft’s Bing?” Forbes. Forbes Magazine, 23 July 2012. Web. 09 Nov. 2012.


Strauss vs. Microsoft Corp

In 2003, the United States government created what it termed the National Strategy to Secure Cyberspace. This strategy was composed of five components, which are as follows in direct quote:

  • a cyberspace security response system, a network through which private sector and government organizations can pool information about vulnerabilities, threats, and attacks in order to facilitate timely joint action;
  • a cyberspace security threat and vulnerability reduction program, consisting of various initiatives to identify people and organizations that might attack U.S. information systems and to take appropriate action in response;
  • a cyberspace security awareness and training program, consisting of several initiatives to make the public more vigilant against cyberthreats and to train personnel skilled in taking preventive measures;
  • an initiative to secure the governments’ cyberspace, which includes programs that state and federal government agencies will take to protect their own information systems; and
  • national security and international cyberspace security cooperation: initiatives to ensure that federal government agencies work effectively together and that the U.S. government works effectively with foreign governments.

This was a true wake-up call to IT (information technology) companies such as Microsoft. With each passing year, the cost of containing these security risks mounted, reaching an estimated billions of dollars. Yet the efforts produced mostly unsatisfactory results, according to an FBI-sponsored survey conducted by CSI (the Computer Security Institute) in 2002. After the survey, it was noted that few respondents could actually show proof that the financial benefits were enough to justify the cost of the programs.

The CSI survey was regarded as statistically skewed toward certain respondents, and its validity was questioned. After September 11, 2001, the demand for improved internet security tripled, and even people who were not “internet savvy” became well aware of the threats to their livelihoods and their lives. It was claimed that just four possible viruses could cause more estimated financial damage than a natural disaster or a terrorist attack on a major city or government facility.

Yet did Microsoft step up to the challenge?

One factor of internet security that took on major importance was e-mail, in both personal and business use. Since most large companies had moved almost every aspect of their business under computer control, nearly every office worker had access to at least one computer. Companies found that their employees were accessing non-work-related web sites and using company e-mail addresses for personal use, thereby reducing productivity while costing the company labor hours with no profitable return.

A call went up for monitoring systems to supervise such wanton activity as employees viewing pornographic sites or using their business computers to chat with friends and family. Business e-mail became the fastest channel for interoffice communication, such as memos or contact from supervisor to employee. Microsoft developed a program to log a record of all e-mail use so that company management could have a printed record of what passed through their computers. It worked to their advantage and disadvantage, as it was a double-edged blade.
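The text does not describe how Microsoft's logging program worked internally; as an illustrative sketch only, a monitoring system of this kind reduces to scanning each logged message against a policy list (the `FLAGGED_TERMS` list and the sample log below are hypothetical):

```python
# Illustrative sketch of an e-mail monitoring filter: each logged message
# is checked against a company policy list, and matches are reported so
# management has a record of what passed through its computers.
FLAGGED_TERMS = ["confidential", "personal use"]  # hypothetical policy list

def flag_messages(log):
    """Return (sender, subject) pairs whose body matches a flagged term."""
    flagged = []
    for msg in log:
        body = msg["body"].lower()
        if any(term in body for term in FLAGGED_TERMS):
            flagged.append((msg["sender"], msg["subject"]))
    return flagged

# Hypothetical sample of a company mail log
mail_log = [
    {"sender": "alice", "subject": "Q3 memo", "body": "Quarterly figures attached."},
    {"sender": "bob", "subject": "hey", "body": "Using this for personal use again!"},
]
print(flag_messages(mail_log))  # -> [('bob', 'hey')]
```

A real system would also have to store the full message text durably, which is exactly what made the logs usable as evidence in the cases discussed below.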

It allowed supervisory staff to catch employees abusing their computer privileges, which could lead to write-ups and eventual dismissals, but it also caught many upper-level managers and other employees in acts of sexual harassment of fellow employees. This also led to cases of racial discrimination and personal harassment of certain employees, and the logs could be used as evidence in these cases when an employee felt they were denied promotion on the basis of race, gender, age or sexual preference. Such activities led to a lawsuit being filed against Microsoft for just these reasons.

In the 1995 case of Strauss vs. Microsoft Corp. (U.S. District Court for the Southern District of New York), a female employee of a Microsoft publication claimed that her supervisor sent her offensive messages and sexually explicit e-mails. Microsoft stated that the messages were intended to be funny and not meant to be offensive. Microsoft lost, and the court’s decision had a resounding effect on similar cases filed afterwards. Companies discovered that monitoring an employee’s internet activities could be costly if any type of unbecoming behavior was linked to the system.

It strengthened government policy on sexual harassment awareness, and companies were required to adopt nearly zero-tolerance policies on the subject; the upside was that employees became more productive and spent less time using their business computers for personal use. Monitoring also cut down on the possibility of any e-mail virus slipping past the company’s system undetected, and it severely decreased the chances of business material being shared with rival companies.

The Strauss lawsuit was a direct prompt for Microsoft to continue upgrading the security of its business products. Little known is the fact that e-mail accounts, whether business or private, can be monitored by the government or other interested parties with the clearance to do so. So how safe is e-mail? Not nearly as safe as most people tend to believe. Under certain circumstances, employees’ personal e-mails can be protected under the NLRA (National Labor Relations Act) when the e-mail is used to address issues with company policy and the employee’s disgruntlement over that policy.

If an objecting tone is used in an e-mail to a supervisor as an expression of displeasure over a policy, the employee can be protected under the NLRA from dismissal or any other disciplinary action. Businesses that use monitoring systems are protected in many ways but left exposed in many others. Microsoft’s development of this system has drawn mixed reactions, but the pros still have to outweigh the cons in the overall picture.


What Impact Might Change Have on Employees?

How can change affect an organisation, and what impact might it have on employees? How should change be managed? Discuss the factors which must be taken into account when implementing change.

An organisation can be affected in many different ways by change. I believe change to be an important corporate tool, used by managers in order to achieve a greater level of efficiency. Perhaps the manager believes that the company isn’t reaching its full potential, or perhaps management just feels that change is needed. The underlying fact, I believe, is that change happens for a reason. Why fix something if it isn’t broken? Although this statement is bold and suggests that I have only one viewpoint on change, that is not entirely true.

Perhaps a company which is doing well may want to implement change. Why? Why, if a company were doing well, would it want to change anything? Why not just keep things as they are? The answer is very simple: to increase efficiency. By increasing efficiency the company achieves a higher output. The change may be simple, or it may be a total overhaul of the business. Whatever the change, the goal of implementation will be to help the business progress. The effects of change on an organisation, I believe, can be measured by the company’s efficiency: if this has improved, then the change is positive; if efficiency has decreased, the change may have had a negative effect on the organisation.

Take change of culture as an example. Bill Gates, who co-founded Microsoft, has managed to create perhaps the ‘most successful business machine’ evident today. A company such as Microsoft relies very heavily on its employees to maintain and help build the reputation Microsoft has established. A reason for the very high level of productivity at Microsoft may be its ‘culture’. Unlike many other organisations, Microsoft has an informal dress code, offices are very laid-back and communication is entirely by e-mail; this is the type of workplace Microsoft employees work in.

Whatever the ‘culture’ Mr. Gates has created, it is working for Microsoft. The reason for using this example is simply to demonstrate how a change from typical, formal, corporate working environments can help to breed success. Obviously working conditions like those at Microsoft would be welcomed by many, but a change which has worked for one may not work for all. Any changes made may not only affect the whole external outlook of the organisation and its level of success and efficiency; they may also have drastic internal repercussions for employees. “Human beings as a race are resistant to change” (Ramzi 1999).

Change is not always welcomed! Again I will come back to the point which I have already mentioned: “Why fix something if it isn’t broken?” The idea of change creates anxiety within employees; perhaps the fear of change may enhance performance, but in most circumstances this is not the case: change is feared by most for many reasons. So perhaps success isn’t reliant upon the employees’ ability to adapt, but upon the manager’s ability to implement the change! “The effective management of change is enhanced through careful planning, sensitive handling of the people involved and a thorough approach to implementation” (Colin Carnall).

Change, I believe, is merely an idea. Whether the culmination of a few minds or the vision of one, change is an idea. I believe the first step to successful management of change is to let the workforce know what your vision is. Let the employees know the reason for the change. A large portion of the success of the change is dependent upon the employees’ understanding of the final goal, ‘the future of the organisation’. If they have a different idea of what the change is aiming for, then it may not be successful. Another factor is the manager’s ability to convince the workforce that change is needed, and that successful change will benefit not only him but everyone.

Employees also need to feel secure, to a certain extent. Obviously too much security in one’s job may result in ‘laziness’ or lower levels of productivity. On the other hand, if employees feel that their jobs are not secure and that they are insignificant, they may be working ‘on eggshells’, fearing that if they do wrong it could result in job loss. In that case creativity, vision and productivity will not be encouraged; one may become ‘inactive’ and just do the ‘bare minimum’. Obviously what I have mentioned will differ drastically from one person to another, as people have different opinions. Managers have different management styles which they feel give the best results for their business. For example, a manager who rules by fear (in an autocratic manner) may feel that to get the best out of the workforce they have recruited, employees need that fear and need to be kept ‘on their toes’.


Comparing and Contrasting Microsoft DOS with UNIX

As is suggestive of its name, an operating system (OS) is a collection of programs that operate the personal computer (PC). Its primary purpose is to support programs that actually do the work one is interested in, and to allow competing programs to share the resources of the computer. However, the OS also controls the inner workings of the computer, acting as a traffic manager which controls the flow of data through the system and initiates the starting and stopping processes, and as a means through which software can access the hardware and system software.

In addition, it provides routines for device control; provides for the management, scheduling and interaction of tasks; and maintains system integrity. It also provides a facility called the user interface, through which users issue commands to the system software. Utilities are provided for managing files and documents created by users, developing programs and software, communicating between users and with other computer systems, and managing user requirements for programs, storage space and priority. There are a number of different types of operating systems with varying degrees of complexity.

A system such as DOS can be relatively simple and minimalistic, while others, like UNIX, can be somewhat more complicated. Some systems run only a single process at a time (DOS), while other systems run multiple processes at once (UNIX). In reality, it is not possible for a single processor to run multiple processes simultaneously. The processor of the computer runs one process for a short period of time, then is switched to the next process and so on. As the processor executes millions of instructions per second, this gives the appearance of many processes running at once.
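The time slicing described above can be sketched with a toy round-robin scheduler; the "processes" here are plain Python generators standing in for real programs, and the names are illustrative:

```python
from collections import deque

def make_process(name, steps):
    """A toy 'process': yields once per instruction it executes."""
    for i in range(steps):
        yield f"{name}:{i}"

def round_robin(processes):
    """Run each process for one time slice, then switch to the next,
    until all have finished: the appearance of simultaneity."""
    queue = deque(processes)
    trace = []
    while queue:
        proc = queue.popleft()
        try:
            trace.append(next(proc))   # one time slice of work
            queue.append(proc)         # back of the line
        except StopIteration:
            pass                       # process finished, drop it
    return trace

trace = round_robin([make_process("A", 2), make_process("B", 2)])
print(trace)  # interleaved: ['A:0', 'B:0', 'A:1', 'B:1']
```

Run fast enough, the interleaved trace is indistinguishable from A and B running at once, which is exactly the illusion a single processor creates.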

User programs are usually stored on a hard disk and need to be loaded into memory before being executed. This presents the need for memory management, as the memory of the computer must be searched for a free area in which to load a user’s program. When the user is finished running the program, the memory consumed by it must be freed up and made available for another user when required (CIT). Process scheduling and management is also necessary, so that all programs can be executed and run without conflict. Some programs might need to be executed more frequently than others, for example, printing.
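The load-and-free cycle just described is, at heart, a search of a free list; below is a minimal first-fit sketch over an imaginary 640-unit memory (the sizes are illustrative and do not reflect any real DOS memory map):

```python
# Minimal first-fit memory manager: free regions are (start, size) pairs.
free_list = [(0, 640)]  # illustrative: one free block of 640 units

def load_program(size):
    """Find the first free area big enough and carve the program out of it."""
    for i, (start, avail) in enumerate(free_list):
        if avail >= size:
            if avail == size:
                free_list.pop(i)                       # block used exactly
            else:
                free_list[i] = (start + size, avail - size)
            return start  # base address where the program is loaded
    return None  # no free area large enough

def free_program(start, size):
    """Return a program's memory to the free list when the user is done."""
    free_list.append((start, size))
    free_list.sort()

a = load_program(100)   # loaded at address 0
b = load_program(200)   # loaded at address 100
free_program(a, 100)    # first 100 units available again
print(free_list)        # [(0, 100), (300, 340)]
```

A production allocator would also coalesce adjacent free blocks to fight fragmentation; that detail is omitted to keep the search-load-free cycle visible.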

Conversely, some programs may need to be temporarily halted, then restarted again, so this introduces the need for inter-program communication. In modern operating systems, we speak more of a process (a portion of a program in some stage of execution (CIT, 3)) than a program. This is because only a portion of the program is loaded at any one time. The rest of the program sits waiting on the disk until it is needed, thereby saving memory space. UNIX users speak of the operating system as having three main parts: the kernel, the shell and the file system.

While DOS users tend not to use the term kernel and only sometimes use the term shell, the terms remain relevant. The kernel, also known as the “Real Time Executive”, is the low-level core of the OS and is loaded into memory right after the loading of the BIOS whenever the system is started. The kernel handles the transfer of data among the various parts of the system, such as from hard disk to RAM to CPU. It also assigns memory to the various system-level processes that occur whenever the computer does anything. The kernel is also responsible for scheduling the CPU’s operations and for letting the shell access the CPU (PC Mag, 1).

The shell is the visible user interface to the OS: a program that loads on top of the operating system and offers users commands that let them access the OS. Strictly speaking, the shell is an input utility that offers access to the operating system. Technically speaking, the shell, being a separate program, is not a part of the OS at all. In the UNIX world a number of shells are available, among them the Korn shell, the C-shell, the Bourne shell and the Bourne Again shell (yes, really). In DOS, the standard shell is COMMAND.COM, again nothing more than a program.

As different versions of COMMAND.COM came with different versions of DOS, each added new commands and new things that could be done by the user. For example, DOS 4’s COMMAND.COM added the /P switch to DEL to verify each deletion, and DOS 5’s COMMAND.COM provided the ability to sort the output of the DIR command. An acronym for disk operating system, the term DOS can refer to any operating system, but is most often used as shorthand for MS-DOS. Originally developed by Microsoft for IBM, MS-DOS was the standard operating system for IBM-compatible computers.
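Whether COMMAND.COM or a UNIX shell, the core job is the same: read a command line, interpret it, and act on it. A stripped-down sketch of that loop (the builtin names and version string are invented, not a real shell's):

```python
# A toy command interpreter in the spirit of COMMAND.COM or the Bourne
# shell: parse one command line and dispatch to a builtin command.
def builtin_echo(args):
    return " ".join(args)

def builtin_ver(args):
    return "ToyShell version 0.1"  # illustrative version string

BUILTINS = {"echo": builtin_echo, "ver": builtin_ver}

def interpret(line):
    """Parse a command line and run it, as a shell's main loop would."""
    parts = line.split()
    if not parts:
        return ""
    cmd, args = parts[0].lower(), parts[1:]
    if cmd in BUILTINS:
        return BUILTINS[cmd](args)
    return f"Bad command or file name: {cmd}"  # DOS-style error message

print(interpret("echo hello world"))  # -> hello world
```

Adding a command to a new shell release, as each COMMAND.COM version did, amounts to adding an entry to the dispatch table.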

The initial version of DOS was somewhat uncomplicated and resembled another operating system called CP/M. Subsequent versions became increasingly sophisticated; however, DOS remains a 16-bit operating system without support for multiple users or multitasking. The earliest forms of DOS were crude and utilized only a few commands, but as computers became more advanced, so did DOS. By keeping up with technology, DOS found its way into more “user-friendly” operating systems. However, as more sophisticated operating systems were released, DOS became less important.

Today, cyberpunks involved with the latest OS trends joke that DOS stands for “Dad’s Operating System” (Comerford, 23). In 1980, IBM asked the Microsoft Corporation to produce the operating system for its first personal computer, the IBM PC. Prior to this, a company called Seattle Computer Products had sold an operating system called 86-DOS to Microsoft. Microsoft hired the author of 86-DOS, Tim Paterson, in April of 1981 to modify the system, and, renaming it MS-DOS (Microsoft Disk Operating System), released it with the IBM PC.

Thereafter, most manufacturers of personal computers licensed MS-DOS as their operating system (Britannica, 1). Limitations of the early PC’s hardware were a big influence on MS-DOS. Although the 8088 model computer had a 1Mb address space, IBM decided to allocate the first 640K of this to RAM, and the rest to ROMs, video boards and other things. Consequently, MS-DOS was set up to support programs whose maximum size was 640K. Version 1.0 of DOS was released along with the IBM PC in August 1981. It occupied 12K of the system’s 640K of memory, was somewhat compatible with CP/M and, much like CP/M, supported only a single directory.

By contrast, even the first version of UNIX had a full hierarchical file system. In addition, Version 1.0 supported only a 160K single-sided 5¼-inch floppy diskette. Version 1.1 was released by Microsoft in October 1982 and supported double-sided 320K diskettes. Aside from fixing some bugs, this release was similar to Version 1.0. Releases such as 1.1, in which the number to the left of the decimal point is the same as the previous version, represent relatively minor changes from the previous release. By contrast, Version 2.0 was largely a new system.

In March 1983, IBM introduced the PC/XT, its first personal computer with a hard disk. It came with a new variant of MS-DOS, Version 2.0. In this version, Microsoft incorporated many ideas from the UNIX system, for which it was also a vendor. For example, incorporating minor changes, the MS-DOS file system was taken largely from UNIX. In addition, the shell was improved, and Version 2.0 supported a new floppy diskette format, the 360K, as well as user-installable device drivers, print spooling, system configuration and memory management.

At this point, MS-DOS was established as the dominant operating system in the PC market. In August 1984, IBM released its first 286-chip-based PC, the PC/AT. The PC/AT supported memory up to 16Mb and had the ability to run multiple programs at once. However, the version of MS-DOS that shipped with the PC/AT was 3.0, which supported neither of these. Rather, it ran the PC/AT in a mode that simulated the 8088, only faster. Since the PC/AT came with a 1.2Mb disk drive, battery-backed clock, and configuration information in the CMOS, support for these devices was added.

What’s more, hard disks larger than 10Mb were now supported. In addition, the command processor (shell) was removed from the operating system and made into a separate program. In November 1984, 3.0 was replaced by 3.1, which provided the first support for networking. In 1987, IBM came out with the PS/2 line of PCs, which shipped with MS-DOS 3.3, providing support for both 720K and 1.44Mb 3½-inch floppy disk drives. With Version 4.0, Microsoft added the DOS shell, a menu-driven shell rather than the previous keyboard-driven ones. In addition, it now provided support for hard drives larger than 32Mb.

A major new release, MS-DOS Version 5.0, was shipped in April 1991. Although this was the first version that made any serious use of extended memory, it still had the restriction that programs could not exceed 640K. However, it had the ability to locate most of MS-DOS itself in extended memory, so about 600K of the lower 640K was now available for user programs. Version 5.0 also came with a useful HELP utility to aid new users. For the first time, MS-DOS was sold in stores to the public (previous versions were only sold to computer vendors who delivered them with their machines) (CIT, 1-3).

The MS-DOS 6 family provided more memory management for applications such as Microsoft Windows. In addition, newer utilities were provided for disk-defragmentation, file compression, file backups and anti-virus checking. Other variations of MS-DOS exist, such as PC-DOS by IBM, DOS-V, Dr. DOS and others. There is even a FREE DOS available on the Internet as an MS-DOS clone. Although it can still be found on many computers, MS-DOS is technically an obsolete operating system, being replaced by Microsoft Windows. For personal computers, MS-DOS is a single user, single tasking operating system.

Single user means only one person uses the computer at a time. Single tasking means that it essentially runs one application program at a time, and has no inherent support for running more than one application program simultaneously (CIT, 2). If we want to look at the basic DOS operating system itself, there is no need to look further than three system files: Command.com, Io.sys and (in DOS 6.x and earlier) Msdos.sys. These files are crucial in DOS versions up to 6.22. Io.sys represents the lowest level of the interface and contains the routines necessary for interfacing the OS with the system’s BIOS.

It implements MS-DOS as seen by the hardware and has default drivers for console display and keyboard, printer, serial communications, clock, and a boot disk drive. Msdos.sys handles the higher-level routines, such as converting commands from applications into instructions for Io.sys. It implements MS-DOS as seen by application programs. It supports file and record management, memory management, character device input and output, execution of other programs, and access to a real-time clock (CIT, 3). Both of these files are in the root directory, and both are hidden from view by default.

The idea is that you are not supposed to see them, so that you don’t do anything destructive to them (such as deleting them). They are also read-only so that they can’t be deleted accidentally. Command.com is the shell program which interprets user commands, presents the shell prompt, and contains a set of internal commands. The rest of MS-DOS consists of a number of utility programs. Although DOS had cornered the PC market, UNIX was still dominant on the larger workstations. The birth of UNIX in 1969 provided the world with its first modern operating system.
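The division of labor among these three files can be sketched as a stack of toy classes; this illustrates the layering described above, not the real code, and all names below are invented:

```python
# Toy sketch of the DOS layering: the shell (Command.com) talks to the
# higher-level DOS routines (Msdos.sys), which convert requests into
# low-level calls to the hardware interface layer (Io.sys).
class IoSys:
    """Lowest layer: default device drivers (here, just a console)."""
    def write_console(self, text):
        return f"[console] {text}"

class MsdosSys:
    """Higher-level routines: turns application requests into IoSys calls."""
    def __init__(self, io):
        self.io = io
    def print_string(self, text):
        return self.io.write_console(text)

class CommandCom:
    """The shell: interprets a user command and calls the MSDOS layer."""
    def __init__(self, dos):
        self.dos = dos
    def run(self, line):
        if line.startswith("echo "):
            return self.dos.print_string(line[5:])
        return self.dos.print_string(f"Bad command: {line}")

shell = CommandCom(MsdosSys(IoSys()))
print(shell.run("echo hello"))  # -> [console] hello
```

The point of the layering is substitutability: because the shell only ever talks to the layer below it, Command.com could be (and was) shipped and upgraded as a separate program.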

An interactive multi-user operating system, UNIX was initially developed by programmers for their own use. Working for Bell Laboratories, Ken Thompson and Dennis Ritchie created UNIX as an operating system for the PDP-7 computer. Designed as a simplification of an operating system named Multics, UNIX was developed in assembly language, a primitive computer language specific to one type of machine (Osiris, 1). However, Thompson developed a new programming language, “B”, which Ritchie enhanced into “C”, and in 1973 this was used to rewrite UNIX, which lent the OS portability (Linux Intl., 1).

The original design philosophy for UNIX was to distribute functionality into small parts, the programs (Theochem, 1). In this way, functionality could be achieved by combining the small parts (programs) in new ways. Moreover, if a new program were to appear, it could be integrated into the system. UNIX was slow to catch on outside of academic institutions but soon was popular with businesses as well. The first five versions were part of an internal research effort of Bell Labs, and it was not until the sixth version, called UNIX Timesharing Sixth Edition V, that UNIX was widely distributed (Osiris, 1).

Relatively recent developments are graphical user interfaces (GUIs) such as MOTIF, X Windows and Open View. UNIX has two major versions. One, jointly developed by UNIX Systems Laboratories (USL) and by AT&T researchers together with Bell Labs, generically known as System V, is the commercial version and is the most widely distributed by major manufacturers. The second, developed by the University of California, Berkeley as the Berkeley Software Distribution (BSD), is the educational version and is completely focused on research. The USL version is now on its fourth release, or SVR4, while BSD’s latest version is 4.

However, there are many different versions of UNIX besides these two. The operating system has been licensed to several manufacturers, who in turn developed their own versions of UNIX, based on System V or BSD but adding new characteristics. Most versions of UNIX developed by software companies are derived from one of the two groupings, and recent versions of UNIX actually incorporate features from both of them. However, UNIX has had an unregulated history, with over 200 versions (Berson, 16) existing today.

The UNIX system is made up of three primary components, the kernel, the shell, and the utilities (which includes the file system). The central part of the OS, the kernel is the first program to start when the system is turned on and the last program to do anything when the system is halted. In addition to scheduling tasks, it manages data/file access and storage, enforces security mechanisms and performs all hardware access. The name “KERNEL” represents the fact that it is a program designed as a central nucleus, around which other functions of the system were added.

While the kernel is the heart of the operating system and interacts directly with the system’s hardware, the shell presents each user with a prompt, interprets commands typed by the user, executes user commands and supports a custom environment for each user. The two most common shells are the Bourne shell, the default for System V, and the C-shell, used mainly with the BSD version (Osiris, 1). The utilities consist of file management (rm, cat, ls, rmdir, mkdir), user management (passwd, chmod, chgrp), process management (kill, ps) and printing (lp, troff, pr).


Skype Improvement

In today’s devices-and-services world, more and more people choose to use Skype from devices like smartphones and tablet PCs that rely on battery power, that on average have less processing power than modern desktop computers, and that are not physically or permanently connected to the Internet. This shift in the way people use Skype has required Skype to enhance its P2P connectivity while conserving battery and processing power, all while delivering even more of the functionality and reliability that users expect.

Skype clients will continue to evaluate bandwidth, connectivity and firewall settings to select the most appropriate path for a call, and continue to connect devices for P2P calls across the Internet, so that users get audio and video connections that “just work”. As Skype continues on this journey, these new technologies are helping to drive improved battery life and improved connections, making Skype as a whole more resilient and providing a platform for exciting new features.

Introducing new features

Skype has to introduce exciting new features such as video messaging, which lets users catch up whenever or wherever they are, connecting them with the special people in their lives even when their schedules conflict. Skype should invest in bringing new Skype scenarios online, putting the people who matter most to users just a click away. For example, in Outlook.com users are able to connect through Skype without leaving the browser.

Improving registration and account security

Skype should improve its sign-up, sign-in, and security features. New features such as two-step verification will bring additional security to users. As Skype extends to new communication platforms such as Xbox, the Microsoft account will mean that users have just one account to remember, unlocking communications with a growing community of over 700 million Microsoft account users worldwide.

Improving Skype chats

Skype users send billions of chat messages every month, and enhancing the performance and quality of this core messaging experience is one of the key improvements Skype should highlight. The Skype cloud should add the ability to queue and deliver chat messages even if the intended recipient is offline, so that users can be 'always reachable' to the people who matter most. Skype should also broaden


Apple’s Strategies Since 1990

Evaluate Apple's strategies since 1990 and explain why Apple has been through difficult times. What made the "Apple turnaround" possible?

After forcing out Steve Jobs, Apple tried to fit into many different markets. It diversified into many different areas and ended up with half a dozen products for each of them, but this was not what had made Apple famous. In 1986 Apple was seen as a rebellious company trying to be different from IBM and Microsoft. Steve Jobs' idea was not to follow the same path as other companies in the computer industry but to create a company that was unique.

Unfortunately, the CEO of Apple at that time did not share this perspective and forced Jobs out of the company. Apple was not able to keep up with IBM and Microsoft, who had a far greater market share. In the period 1990-1997 Apple had three different CEOs, which speaks for itself. The era of Sculley, Spindler and Amelio was not a successful one; a company as big as Apple should not have needed to change its CEO every two years. In this period Apple was seen as one of the worst-managed companies in the industry.

Apple's image as a simple, focused company was hurt by product lines that varied only slightly in their technical specifications. John Sculley, Apple's CEO from 1985 to 1993, attempted to gain market share through lower-priced products, alliances with IBM, and outsourcing most of the manufacturing in order to cut costs. When Spindler became CEO, he withdrew from the alliances Sculley had formed and started licensing Apple's OS to companies that would then work on Mac clones.

Amelio replaced Spindler after Apple's flat performance. Further restructurings were undertaken, but unfortunately they all led to nothing. Probably one of the best decisions Apple made was the acquisition of NeXT and the return of Steve Jobs. One of the first measures undertaken was bringing development back in house: Jobs believed it would be of far greater benefit if Apple developed software, hardware and design all under one roof. The advantage was that everyone had a holistic view of product development.

This worked out very well and became one of Apple's competitive advantages. Another important step that made the 'turnaround' possible was that Apple stopped licensing out its operating system. Steve Jobs was back, and Apple was soon in better shape than in any of the years without him. He demonstrated that he had learned from his mistakes through his willingness to cooperate with Microsoft, allowing it to develop MS Office software for Macs. The first product released after Steve Jobs' return was the iMac, in 1998.

Many people at the company did not believe it would be a success, but Steve Jobs proved everyone wrong. The iMac was a huge hit and brought some market share back to Apple; more importantly, the company regained its confidence and got back on the right track. During the development of the iMac, Steve Jobs decided to raise the pirate flag once more. In one of his interviews he stated that 'Apple forgot who Apple was', and this marked the return of the rebellious company.


How Helpful Is Microsoft Word, Excel and Powerpoint?

Avery Parsons, MAN 1030, Week 3, Professor Nathan Riggs, Nov. 4, 2012

Please define each of these four forms of business ownership and then respond to the following questions: (1) Sole proprietorship: a business that is owned and usually managed by one person; it is the most common form. (2) Partnership: a legal form of business with two or more owners. (3) Corporation: a legal entity with the authority to act and have liability separate from its owners. (4) Franchisor: the entity or person owning the overall rights and trademarks of the company, who allows its franchisees to use those rights and trademarks to do business.

Do you think Sonic would have grown as large as it has today if it had remained a sole proprietorship? Why or why not? In my opinion, no, because it would have faced the disadvantages of limited growth, limited resources and unlimited liability.

A partnership would have brought conflicts with a partner, division of profits, difficulty of termination and unlimited liability. As a corporation, Sonic gained limited liability, the ability to raise more money for investment, ease of ownership change and ease of separating ownership from management. What were the advantages and disadvantages to Sonic of each form of business ownership?

The advantages are the ability to have Sonic restaurants all over the world, personal ownership, a lower failure rate, and management and marketing assistance compared with someone starting a business from scratch. The disadvantages are large start-up costs, shared profits, management regulations and the coattail effect. Another advantage is having a unified voice to protect their investments.

Sonic has survived and continues to be successful by maintaining a strong fast-food presence throughout the years as a drive-in, while other chains have gone under one by one. There have been many drive-in and fast-food restaurants over time. In your opinion, what makes Sonic more successful than the others? The quality of the food it serves, along with the nostalgia of an old-time tradition or the novelty of it, depending on the age of the customer, provides an irresistible attraction to a wide variety of clientele, making Sonic a potentially very profitable franchise option. In closing, Sonic remains a leader: through various changes and innovations made rather recently, Sonic Drive-In has been able to expand its number of store locations within the past decade. Today there are nearly 3,000 store locations across the nation, and the company boasts a market capitalization that exceeds $1.5 billion.

References: Learning Activities 1 and 2; learning video "Sonic Is Booming"; Understanding Business (McGraw-Hill), Chapter 1, pp. 8-11, and Chapter 5, "How to Form a Business", pp. 114-145.
