Steps to Repair the Microsoft Office 1402 Error Code

The 1402 error normally appears when you're trying to install or modify the Microsoft Office suite on your PC. It is caused when the installation utility is unable to change or edit a set of registry keys on your system, preventing it from installing the files it needs. Fortunately, you can fix this error with some simple steps, which are outlined in this article.

The 1402 error normally appears like this:

Error 1402: Could not open key

You can fix the 1402 error on your PC by first running a "registry check", if you're using Windows 98 or 2000. A registry check is a built-in Windows facility that looks through your PC's registry and ensures all its elements are correct and functioning properly. By using this feature, you should be able to see whether any of your system's registry settings are damaged or corrupted. If any are, it will tell you how to fix them.

If you are using a later version of Windows, or do not find any registry errors, the next thing to do is try to reinstall the Office program under a different user profile. It's often the case that a user profile does not have sufficient privileges to install a program like Office, so it's important to use one that does. If you have a system administrator, it's recommended you get the administrator to install Office for you.
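
If you want to confirm that privileges really are the obstacle before switching profiles, here is a minimal diagnostic sketch, assuming Windows and Python 3; the key path is a hypothetical example, so substitute the key named in your own error message.

```python
# Minimal diagnostic sketch (assumes Windows + Python 3): try to open a
# registry key with write access, the same kind of access the installer
# requests. A PermissionError means the current account lacks privileges.
# The key path below is a hypothetical example; substitute the one from
# your 1402 error message.
import winreg

KEY_PATH = r"SOFTWARE\Classes\Installer\Products"  # hypothetical example

try:
    key = winreg.OpenKey(
        winreg.HKEY_LOCAL_MACHINE,
        KEY_PATH,
        0,
        winreg.KEY_ALL_ACCESS,  # read + write access
    )
    winreg.CloseKey(key)
    print("Write access OK - permissions are probably not the problem.")
except PermissionError:
    print("Access denied - this profile cannot modify the key.")
except FileNotFoundError:
    print("Key does not exist on this system.")
```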

After that, you should use a 'registry cleaner' program to fix any errors that could be causing the 1402 error on your system. Registry cleaners are useful tools: they scan through your computer and repair damaged or corrupt settings inside the registry, which can improve the speed and reliability of your system at the click of a button. Since the 1402 error is often caused by registry errors, using a registry cleaner to fix them is highly recommended. …

Why Does My Computer Run Slow?

A slow PC is an issue that needs prompt resolution. There are many possible reasons for the reduced speed of your PC. If you too are a victim of such computer problems, here are a few tips that may help you deal better with your PC.

The first tip is to figure out the reason your PC's speed is decreasing. Some of the most common causes of a slow computer are:

  • Too little free space on your hard disk
  • A computer virus
  • Unnecessary software installed on your system
  • Corrupted data
  • Missing Windows updates and outdated drivers
  • Poor hardware support

Resolution

  • Verify that your PC has a minimum of 1 GB – 2 GB of free hard disk space. You can determine this by checking the disk space and then deleting unnecessary items from your PC (see the sketch after these steps). This space is very important for a speedy computer, as it gives the system room for temporary files.
  • You can also run Scandisk or a similar tool to ensure that nothing is physically wrong with your computer's hard disk drive.
  • If data is stored in a fragmented manner, your computer may also slow down. To avoid this, you can run 'Defrag', which arranges the data on your disk in the best possible order, automatically enhancing your PC's speed.
  • A computer virus is a major cause of a slow PC, so remove any viruses from the computer. You can install a good virus removal tool (antivirus software) to detect malicious items hidden on your PC and then remove them.
  • Low RAM capacity is also a reason for poor PC speed. To get rid of such issues, you can consider upgrading your RAM.
  • Sometimes missing Windows updates also make a PC slow. To stay away from such issues, tech support experts advise going online and updating the operating system.
  • Start in Safe Mode: this is considered an important step in diagnosing and enhancing the speed of your PC.

The steps to start the computer in Safe Mode are written below:

  • Click Start, then click Shut Down.
  • Click Restart and then click OK.
  • Press F8 before your computer starts to load Windows.
  • Select Safe Mode and press ENTER.
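
As a quick illustration of the free-space check in the first tip, here is a minimal sketch using only Python's standard library; the drive letter and the 2 GB threshold are illustrative assumptions taken from the advice above, so adjust them for your system.

```python
# Reports free disk space and flags the drive if less than the suggested
# 1-2 GB minimum remains. Drive letter and threshold are assumptions.
import shutil

DRIVE = "C:\\"        # assumed system drive
MIN_FREE_GB = 2       # the article's suggested minimum

usage = shutil.disk_usage(DRIVE)
free_gb = usage.free / 1024**3
total_gb = usage.total / 1024**3
print(f"{DRIVE} free space: {free_gb:.1f} GB of {total_gb:.1f} GB")

if free_gb < MIN_FREE_GB:
    print("Low on space - delete unnecessary files or move data elsewhere.")
```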

If you still cannot find the reason your PC has slowed down, you may consult a professional tech support service provider. The Internet is full of information on quality on-call and online tech support providers who not only find your computer's issues but also provide quality support for them. …

Information Technology Schools in Atlanta

The demand for technically skilled workers is growing rapidly. According to the US Bureau of Labor Statistics, Information Technology is the fastest growing industry, with a 68% increase in output growth rate projected between 2002 and 2012.

Finding an Information Technology School or university can be an overwhelming task with so much on offer.
For a qualification to have any value, it needs to be obtained from a properly accredited and state licensed Technical School or College. Once the credentials are in place, other factors such as technical courses on offer, financial aid, flexibility of courses and other unique features can become qualifying factors.

This article will look at the unique and flexible features of some of the schools in Atlanta, offering Information Technology courses.

Brown Mackie College is unique in offering one-month courses, with the focus on one subject per month. This allows very flexible study time that can be integrated with a job or other time demands. There is no need to wait for “fall” enrollment, as courses begin each month. This Information Technology School is located within minutes of downtown Atlanta, and a shuttle bus from the train station takes students to the campus. Technical programs focus mainly on Computer Technology, with courses in Information Technology, Database Technology, Networking and Software Technology.

DeVry University comes highly recommended and offers a variety of technical courses in its College of Engineering and Information Science as well as the College of Media Arts and Technology. The university is unique in allowing students to take courses at any of the other Atlanta campuses, and students are further allowed to do part of their studies online. More than 85% of DeVry students receive financial aid in one form or another. A 90% career success rate and a dedicated career service team ensure that students at this school get the help they need to succeed in the demanding job market.

At Westwood College, working professionals in their specific technical fields are employed as instructors. The aim is not only to tutor but to give students hands-on technical experience while learning. Classes can be taken during the day, in the evening or online. A unique and valuable feature of this College is its alumni tuition program: to keep up to date with the rapidly changing world of technology, it offers free courses to its alumni. Definitely something to consider for years to come.

The Art Institute of Atlanta is a Technical School that uniquely caters to the creatively inclined. Its Media Arts programs focus on the artistic side of Computer Technology; filmmaking, animation, game and web design are only a few of the courses on offer. This school makes it possible for students who cannot study full time to achieve their goals by enrolling in an evening or weekend program.

All these schools have unique and versatile features to cater for the needs of all prospective students and the growing Information Technology market.…

Is It Better to Add or Subtract EQ When Mixing Music?

Home studio producers and recording engineers who have learned over time how to mix music know that EQ is a crucial tool for creating a balanced mix: one that represents the music well, is pleasing to the ear, and makes the listener want to return to it time and again. So getting the balance right and knowing when to subtract or add equalisation is vital, and EQ is often overused.

Equalisation, or EQ, is essentially the volume of frequency ranges within a piece of music. Sub bass and bass make up the low frequency ranges; the low-mid, high-mid and presence ranges are next up; then come the high frequencies, some of which we cannot hear. Human ears are particularly sensitive to the mid and high-mid or presence range, as this is where speech lies in the frequency spectrum.

If music is harsh or tiring on the ears, it usually means the mid range has been accentuated too much.

Another common error is too much bass which leads to ‘woolly’ or muddy mixes. So it is a delicate balance to strike to get all the elements of a track, song or piece of music heard as you intend when you are mastering how to mix music.

A common strategy is additive equalisation (EQ), used to bring out some frequencies or sounds within an instrument or track. This effectively means you are turning up the volume of part of the instrument's sound, or of that region of the piece overall. You can see the effects of this on some stereo hi-fi systems with EQ settings, or even in computer software such as Winamp and Windows Media Player. Since adding EQ is essentially increasing the volume of a frequency range, it is important to be aware that this will affect the balance of the harmonics or overtones of the sound. With many plug-in software EQ devices, it can also result in distortion or phasing issues (where frequencies start to cancel each other out or negatively affect each other in other weird and wonderful ways).

Subtractive EQ – where you take out some frequencies or turn them down – can also cause phasing issues. But I tend to go for subtractive instead of additive where possible, as I feel it is less harsh and tiring on the ears and can leave room for other instruments or frequencies to breathe. For example, if a vocal track is not cutting through the guitar in a singer-songwriter song, I will first try carving out some of the mid frequencies from the guitar rather than adding more mid range to the vocal track.
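
Here is a minimal sketch of that subtractive move: a peaking EQ cut in the guitar's mid range, built from the standard RBJ "Audio EQ Cookbook" biquad formulas. NumPy and SciPy are assumed, and the centre frequency, Q and gain are illustrative starting points rather than fixed rules.

```python
# Subtractive EQ sketch: a gentle mid cut via a peaking biquad filter.
import numpy as np
from scipy.signal import lfilter

def peaking_eq(fs, f0, q, gain_db):
    """Return biquad coefficients (b, a) for a peaking EQ.
    Negative gain_db gives a cut (subtractive EQ)."""
    a_lin = 10 ** (gain_db / 40)            # amplitude from dB gain
    w0 = 2 * np.pi * f0 / fs                # centre frequency in radians
    alpha = np.sin(w0) / (2 * q)            # bandwidth term
    b = np.array([1 + alpha * a_lin, -2 * np.cos(w0), 1 - alpha * a_lin])
    a = np.array([1 + alpha / a_lin, -2 * np.cos(w0), 1 - alpha / a_lin])
    return b / a[0], a / a[0]               # normalize by a0

fs = 44100                                   # sample rate
guitar = np.random.randn(fs)                 # stand-in for a real guitar track
b, a = peaking_eq(fs, f0=2500, q=1.4, gain_db=-4.0)  # gentle mid cut
guitar_cut = lfilter(b, a, guitar)           # leaves room for the vocal
```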

So, before you reach for a boost to make frequencies more present or prominent in the mix, perhaps try taking away some frequencies from the parts that are overshadowing or conflicting with them. Of course …

Is Your PC Running Slow Or With Errors?

Has your computer been slowing down, with errors that keep appearing constantly? There is no need to worry when this happens. There is a common belief that when your computer starts slowing down, the most probable cause is a virus attack. So what do many people end up doing? They purchase the best antivirus software from the stores to get rid of the viruses. Yet often the antivirus works for a short period of time, then the machine starts slowing down again.

Have you ever wondered whether another computer problem could cause the PC to slow down? There are many other things that can lead to this. Did you know there is a common problem that occurs on almost all machines? That problem is a corrupt registry. Some common issues are caused by a corrupt or damaged registry, ranging from repeated error messages, severely slowed start-up and operating speed, recurrent stalling and sluggishness of the system, and persistent application errors and crashes, to periodic failures to start up at all. All these problems are associated with the registry.

The registry can best be described as the yellow pages of your computer: it is responsible for storing information and settings for all the different applications on the PC. Almost all activities the machine performs, including opening and reading files, are done with the help of the registry. The problem is that it is prone to damage and corruption, causing the system to run slower and with more problems than you would expect. The usual way to fix it is to use a registry cleaner.
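
To make the "yellow pages" analogy concrete, here is a small sketch (assuming Windows and Python 3, standard library only) that looks up one well-known setting Windows itself stores in the registry, the same way applications look up theirs by key and value name.

```python
# Read a well-known registry value: the Windows product name.
import winreg

key = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                     r"SOFTWARE\Microsoft\Windows NT\CurrentVersion")
product, _type = winreg.QueryValueEx(key, "ProductName")
winreg.CloseKey(key)
print(f"This machine reports itself as: {product}")
```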

Special tools have been created to deal with any problem that arises from the registry. These cleaners can be downloaded or, in other instances, bought from stores, after which you install them on the machine and run the cleaner. Most of them are effective and leave your machine running fast, with no errors displayed. So take note: when you notice the machine slowing down, the first step is not installing antivirus software, but checking and cleaning the registry. …

Constructionism, Logo, and Seymour Papert

Seymour Papert – Logo

In the mid 1960s Seymour Papert, a mathematician who had been working with Piaget in Geneva, came to the United States, where he co-founded the MIT Artificial Intelligence Laboratory with Marvin Minsky. Papert worked with the team from Bolt, Beranek and Newman, led by Wallace Feurzeig, that created the first version of Logo in 1967.

The Logo Foundation

'Logo is the name for a philosophy of education and a continuously evolving family of programming languages that aid in its realization.' – Harold Abelson, Apple Logo, 1982. This philosophy is based on Constructivism (a learning theory). The Logo Programming Language, a dialect of Lisp, was designed as a tool for learning. Its features – modularity, extensibility, interactivity, and flexibility – follow from this goal. It is used to develop simulations and to create multimedia presentations. Logo is designed to have a "low threshold and no ceiling": it is accessible to novices, including young children, and also supports complex explorations and sophisticated projects by experienced users. The most popular Logo environments have involved the Turtle, originally a robotic creature that sat on the floor and could be directed to move around by typing commands at the computer. Soon the Turtle migrated to the computer graphics screen, where it is used to draw shapes, designs, and pictures.
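
Logo's turtle lives on in Python's standard turtle module, which was modelled on it. Here is a minimal sketch of the classic first exercise: directing the turtle to draw a square, much as a child would type Logo commands.

```python
# Draw a square with the Logo-style turtle from Python's standard library.
import turtle

t = turtle.Turtle()
for _ in range(4):      # four sides of a square
    t.forward(100)      # move 100 units in the current direction
    t.right(90)         # turn 90 degrees clockwise

turtle.done()           # keep the window open until closed
```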

Further Information

In the 1960s, Alan Kay and Seymour Papert envisioned the computer's role as a tool for the mind, an 'idea processor'. They have worked at bringing computers into this role for adults and children through Croquet and some of Croquet's predecessors: the Logo language and environment, by Papert, and Squeak, the open source Smalltalk language and environment, by Kay. Squeak and Croquet developed from the early work on Smalltalk and provide a tool for end user programming, collaboration, visualization, and simulation.

Constructionism

The work of Seymour Papert demonstrates the approach of constructionism (Papert and Harel, 1991) (Resnick, 1996). Constructionism is based on the constructivist theories of Piaget. To this theory constructionism "adds the idea that people construct new knowledge with particular effectiveness when they are engaged in constructing personally-meaningful products" (Resnick, 1996). Resnick goes on to say: "This vision puts construction (not information) at the center of the analysis. It views computer networks not as a channel for information distribution, but primarily as a new medium for construction, providing new ways for students to learn through construction activities by embedding the activities within a community." Resnick calls this theory Distributed Constructionism. It involves a community gaining an understanding of a problem by interacting with a knowledge-building community, the problem to be modeled, and tools to model the problem and build a solution. An example Resnick cites is the work of Kimberly (1995), where participants became part of the simulation they constructed in order to understand economic models. The idea of constructionism is related to end user programming and to ontology modeling and building. Resnick explains his use of interactive web based knowledge …

What is Bluetooth Technology?

Bluetooth technology is a wireless technology that eliminates the need for the many inconvenient cables used to connect computers, mobile phones, digital cameras, handheld devices and new digital appliances. Bluetooth enables users to connect to a wide variety of telecommunication and computing devices easily, without cables.

It makes rapid, automatic ad hoc connections between two or more digital devices. Bluetooth provides the opportunity to use mobile data in different applications, and it makes wireless communication between two devices in a localized area, such as a room in an office or home, very easy. Bluetooth technology uses radio-based links, and all the connections between devices are invisible and instantaneous.

With Bluetooth technology, your laptop can send a print request to a printer in the next room. Bluetooth is actually a standard for wireless communication between devices in a relatively small area, and it therefore works well in a personal area network (PAN) using radio frequency.

Any two devices that follow the Bluetooth standard can communicate with each other. A number of Bluetooth devices, such as a digital camera, mobile phone and handheld PC, can form a network. You can send emails to your mobile phone from your laptop without any physical connection between the two.
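
As a minimal sketch of point-to-point Bluetooth communication, here is what an RFCOMM connection looks like using Python's standard socket module. This assumes a Linux host with BlueZ, Python 3.3+, and an already-paired device; the address and channel below are hypothetical placeholders, not real values.

```python
# Point-to-point Bluetooth (RFCOMM) sketch; Linux + BlueZ assumed.
import socket

PEER_ADDR = "00:11:22:33:44:55"   # hypothetical remote device address
CHANNEL = 1                       # assumed RFCOMM channel

s = socket.socket(socket.AF_BLUETOOTH, socket.SOCK_STREAM,
                  socket.BTPROTO_RFCOMM)
s.connect((PEER_ADDR, CHANNEL))   # connect to the paired device
s.send(b"hello over Bluetooth")   # any two devices following the standard
s.close()                         # can exchange data like this
```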

Features of Bluetooth technology

o Bluetooth technology uses radio waves for communication in the 2.4 GHz band.

o It supports multipoint communication, not just point-to-point.

o Bluetooth works within a small area of 10–15 meters.

o Bluetooth offers speeds of 1–2 Mbps.

o Bluetooth chipsets are inexpensive, though more expensive than IrDA.

How Bluetooth technology works

Bluetooth is a high speed wireless link technology that uses radio waves. It is designed to connect mobile phones, laptops, handheld devices and portable equipment with almost no work by the end users. Unlike infrared, Bluetooth does not require line of sight between the connecting units. Bluetooth technology is a modified form of current wireless LAN technology, made more acceptable by its relatively small size and low cost.

The current circuitry is contained on a circuit board 0.9 cm square, and a much smaller single-chip version is in development and will soon be in use. The cost of Bluetooth devices is expected to fall rapidly as the chip is built into more and more devices. In Bluetooth technology, small and inexpensive transceivers are placed in digital devices. The radios operate in the 2.45 GHz band, and Bluetooth supports data speeds up to 721 Kbps plus three voice channels. The Bluetooth chip can either be built into a device or used as an adapter; in a computer, it can be used with the USB port. Each Bluetooth device has a 48-bit address from the IEEE 802 standards, and Bluetooth connections can be either point-to-point or multipoint. Bluetooth range is 10 meters, but …

The History of the Car Computer

The car is a very complicated thing in the modern world, with a whole host of mechanical and electronic systems working together to keep it running at maximum efficiency. The engine control unit is the centerpiece of a car’s electronics: it performs millions of operations each second, making slight adjustments to the actuators based on the information its central CPU receives from numerous real-time sensors. It works alongside the transmission control unit, which ensures that gear changes in automatic cars are as efficient as possible. These car computers not only keep the car running but minimize the amount of fuel wasted, which keeps efficiency and economy high whilst helping to protect the environment with minimal emissions.
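
To illustrate the sense in which an ECU "adjusts actuators based on sensors", here is a toy feedback-loop sketch. Real ECUs run loops like this millions of times per second across many sensors; the function name, ratio target and numbers here are purely illustrative assumptions.

```python
# Toy sketch of ECU-style feedback: read a sensor, nudge an actuator.
TARGET_AIR_FUEL = 14.7           # stoichiometric ratio for gasoline

def adjust_injector(pulse_ms: float, measured_ratio: float) -> float:
    """Lengthen or shorten the injector pulse to steer toward the target."""
    error = measured_ratio - TARGET_AIR_FUEL
    return pulse_ms + 0.01 * error   # small proportional correction

pulse = 3.0                          # current injector pulse width (ms)
for reading in (15.2, 14.9, 14.7):   # simulated oxygen-sensor readings
    pulse = adjust_injector(pulse, reading)
    print(f"sensor {reading:.1f} -> pulse {pulse:.3f} ms")
```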

The cars of today present a marked contrast with the early days of the automobile. From the turn of the century, when the first commercial automobiles emerged, to the end of the 1960s, there were no electronic components; vehicles were designed simply, with robust mechanical control parts and basic methods of control. Back then a better car meant a car with a bigger engine, more speed and more horsepower, and little heed was paid to efficiency, economy, and the environment. However, the issues of the environment, and certainly of economy, became more and more pronounced in the 1970s, with the introduction of regulatory mandates and the notable fuel crisis of the mid 70s.

Around the same time, electronic technology was reaching the point where it could physically be included in automobiles, alongside the transition from carburetors to fuel injection, but it wasn’t until the 1980s that electronics became practical and economical enough to include. Control over the ignition, in the interest of minimizing fuel usage, drove car electronics. The first pieces of circuitry used to control spark timing were large pieces of solid-state circuitry and needed replacement every few years. By the middle of the 80s, the industry had settled on fuel injection completely controlled by electronics.

Naturally, as commercial electronics boomed in the 80s and on through the 90s, becoming smaller, cheaper and more sophisticated, on-board car computers took on more and more functional responsibility, sensing more data and controlling more aspects of the car engine, among other things such as braking and climate control. Indeed, it was not long before the computer became the central and integral component of the car.

With the rise of the computer came the potential for customization, with access to a programmable computer providing immense control over a vehicle’s power and other variables.…

How To Build Your Next Web Site In A Few Hours

I've heard lots of woes from people trying to work with their Web site consultants this week. You know the type: they promise that your site is "just about finished" and the pages "just need some tweaking", and yet nothing gets done. I have had to suffer through whiny rants about delays, bad programming decisions, tools that malfunction, missing logins and content wrecks.

Have we reached the point where building a web site is a lot like building a new freeway? It takes far too many people, time, and dollars, upsets the people who have to live near it, and in the end is obsolete by the time the first people try to use it.

I remember the good ole days of the Web, say 12 years ago, when one person (like me) could build a site in an afternoon, without any really specialized tools or knowledge beyond knowing a few tags and reading a Laura Lemay book.

I am coming to the conclusion that we need to return to those simple days where one person can still build their site, without the heavy lifting of a Web Site Designer and a Web Programming Consultant and an Internet Search Specialist and a Web Marketing Person. (Capital letters deliberately intended to reflect the title's self-importance.)

At one site, a simple database was taking months to webify. I ended up talking to the site's graphic designer, who was the only one with any project management skills and could rein in the wayward development staff. Said staff had trouble configuring something that my high school networking students could do in their sleep. Someone else was complaining to me that her copy of Dreamweaver had started behaving badly, and all I could do was recommend a clean uninstall of every Adobe product on her disk, short of buying a new computer. These are just a couple of the stories I could tell you from this week alone.

So in the 15 or so years of the Web we have better tools, but they still suck. Better sites, but they are still annoying, with pop-ups and dead-end links and overblown graphic frippery. Better site statistics, but still no insight into who comes from where and why they leave our sites. Better traffic, but still a lot of mythology about how the search engines point our way. And speaking of search, why is it that we still cannot do better at deploying good internal site search algorithms?

There is a simple answer: rebel, resist, and reclaim the Web as your own personal place. Avoid the consultantization of the Web. Fire your designers and programmers.

Start afresh with a blogging tool like WordPress or Blogger and build your site around that. Or pick up a couple of widgets and components, or use dabbleDB or Pageflakes or stuff from Google or Yahoo. You do not need a passel of programmers to work this Web.

Since moving over to WordPress and posting these simultaneously to the blog and …

Power and Your PC

The power coming into a computer is the most critical component, and it may be one of the most overlooked. It is simply taken for granted that it will always be there and working properly. A top-of-the-line processor and an ultra powerful video card do nothing if a system does not receive the ample, stable power it demands. Having quality components providing and regulating the power supplied to a computer is critical, and this brief overview looks at a few areas worth considering.

Power Supplies

Computer power supplies take the high (110V or 220V) AC voltage from an electrical outlet and convert it to the various lower DC voltages required for a system to operate. The typical voltages required inside a computer are 3.3V, 5V, and 12V, where the 3.3V and 5V lines are generally used to power circuitry, and the 12V line provides power to run items such as hard drives, optical drives, and cooling fans.

Power supplies are sold in terms of their total power output, or wattage. Choosing the correct power supply means not only finding one that will provide enough power for all of the components connected to the system, but also one that is the correct physical size, has enough connections for typical drives and fans, and, if necessary, has special connections for things such as Serial ATA drives and modern video cards.

Choosing a power supply with enough power shouldn’t be much of a problem, as having more power than you need is never a bad thing. Just because a power supply is rated for 400W, or perhaps 600W, does not mean it draws that much at any given time; the rating just indicates the total power available to the various lines inside the computer. For those interested in getting a good idea of their minimum power requirements, this Power Wattage Calculator is a convenient reference. In addition to checking the total wattage of a power supply, looking for strong amperage ratings on the 3.3V, 5V, and 12V lines is also recommended, as power supplies with identical total power ratings may distribute the power to the various lines in different quantities.
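
The arithmetic behind those per-rail ratings is simple: each rail's available wattage is its voltage times its rated amperage (P = V × I). Here is a minimal sketch; the amperage figures are illustrative assumptions, not a real power supply's label.

```python
# Per-rail power: watts = volts * rated amps. Figures below are assumed.
RAIL_AMPS = {3.3: 28.0, 5.0: 30.0, 12.0: 18.0}  # volts -> rated amps

total = 0.0
for volts, amps in RAIL_AMPS.items():
    watts = volts * amps
    total += watts
    print(f"{volts:>5.1f}V rail: {amps:.0f}A -> {watts:.1f}W")

print(f"Combined rail capacity: {total:.1f}W")
# Note: real supplies cap combined output below the sum of the rails,
# which is why two units with identical total ratings can distribute
# power to the rails quite differently.
```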

Power supplies come in a few different physical sizes, but the most common are designed to fit the standard ATX and micro ATX (mATX) form factor cases. A typical ATX power supply, such as this Echo-Star 680W unit, measures 3.25″ x 6″ x 5.5″ and features two cooling fans to not only cool the power supply, but to also help draw hot air out of the computer. A typical mATX power supply, such as this A-Power 320W unit, measures 2.5″ x 5″ x 4″ and due to the smaller size features just one cooling fan. mATX cases are generally much smaller than ATX cases, and therefore have smaller power supplies, with generally lower power ratings, and fewer connectors.

The connectors on a power supply also deserve consideration. Most power supplies come with what looks to be …