Computer Lingo!

Computer lingo!

Or, “techno-babble” as I like to call it. If you don’t know your IEEE 1394s from your PCI Express x16s, you may run into trouble when building or upgrading the hardware in your computer. Knowing the lingo is essential for picking parts that are compatible and not blowing up your house with parts that conflict (just kidding, you’d blow up your whole neighborhood) (just kidding again, you’d just have to return the part and hang your head in shame for not reading this article).

Here’s some techno-babble that is useful for building or upgrading a computer:

Mobo: Motherboard – the main circuit board that all of the computer’s components attach to.

AGP: Accelerated Graphics Port – a high-speed port on a motherboard for attaching a video card (no longer the standard; PCI Express x16 is today’s standard)

ATX: Advanced Technology Extended … Read More

Large-Scale Data Processing Frameworks – What Is Apache Spark?

Apache Spark is the latest data processing framework to come out of the open source community. It is a large-scale data processing engine that will most likely replace Hadoop’s MapReduce. Apache Spark and Scala are inseparable terms in the sense that the easiest way to begin using Spark is via the Scala shell, though it also offers support for Java and Python. The framework was developed at UC Berkeley’s AMPLab in 2009. So far, roughly four hundred developers from more than fifty companies have contributed to Spark; it is clearly a major investment.
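For a feel of what that looks like, here is a minimal sketch of a first session in the Scala shell (spark-shell), where the SparkContext `sc` is already created for you; the file name “data.txt” is just a placeholder, not something from the article:

```scala
// A minimal sketch of a first session in spark-shell, where `sc`
// (the SparkContext) is already created for you.
// The file name "data.txt" is a placeholder, not from the article.
val lines = sc.textFile("data.txt")           // load a text file as an RDD of lines
val longLines = lines.filter(_.length > 80)   // keep only lines longer than 80 characters
println(longLines.count())                    // trigger the computation and print the count
```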

A brief description

Apache Spark is a general-purpose cluster computing framework that is also very fast and exposes high-level APIs. In memory, it executes programs up to 100 times faster than Hadoop’s MapReduce; on disk, it runs about 10 times faster than MapReduce. Spark comes with many sample programs written in Java, Python … Read More
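To give a flavor of those high-level APIs, here is a hedged word-count sketch in Scala using the classic RDD style; the object name, the input file, and the local master setting are illustrative assumptions, not one of Spark’s bundled samples:

```scala
// A hedged word-count sketch using Spark's Scala API (classic RDD style).
// The object name, file name, and local master setting are illustrative assumptions.
import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WordCount").setMaster("local[*]")
    val sc   = new SparkContext(conf)

    val counts = sc.textFile("input.txt")   // read the input file as lines
      .flatMap(_.split("\\s+"))             // split each line into words
      .map(word => (word, 1))               // pair each word with a count of 1
      .reduceByKey(_ + _)                   // sum the counts for each word

    counts.take(10).foreach(println)        // print a small sample of (word, count) pairs
    sc.stop()                               // shut down the SparkContext cleanly
  }
}
```

The same program can be written with Spark’s Java or Python APIs; the Scala version is shown here simply because, as noted above, the Scala shell is the quickest way to start experimenting.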