The History of Autocorrect is as Fascinating as it Sounds

Autocorrect is actually a pretty fascinating little program. In particular, its manner of selecting the proper spelling and usage of a word is surprisingly complex. Autocorrect grew out of a system that had previously existed in Microsoft Word as a glossary. This glossary allowed for the expansion of existing text, either with an established replacement text or even an image. Autocorrect was first established by Dean Hachamovitch, a former vice president of Microsoft. Hachamovitch (thankfully, autocorrect works on his name) realized that the glossary feature could be used to add more functionality to Microsoft Word.

The first word autocorrect targeted was "teh," a common mistake that appears all over the place, both on the Internet and in professional environments. Hachamovitch created a script that could automatically fix this error when the user hit the left arrow key and F3 simultaneously, but the concept later evolved to make the feature more "automatic." Since English is a very space-oriented language (we use a lot of spaces in our writing), autocorrect could be triggered upon hitting the spacebar, making it much more convenient for everyone. Afterward, it was simply a matter of figuring out what the most commonly misspelled words were. Some of White Mountain IT Services's favorite misspellings are as follows:

seperate vs separate
affect vs effect
its vs it's
misspell vs mispell (nothing is more embarrassing than misspelling mispell)
their, they're, and there
recommend vs reccommend
could of vs could have

So, how does autocorrect figure out which words need to be replaced? There are some words that Microsoft simply cannot condone being used in autocorrect, like obscenities, which are never suggested or flagged.
The replacements for other words are determined according to these factors:

Keyboard proximity
Phonetics
Sentence context

In other words, the words suggested through autocorrect are chosen through the overall context of the sentence they're being used in. It lends itself to the linguistic terms prescriptivism and descriptivism: how words should be used, and how they're actually used in the context of the language. Some words and phrases are often used when they shouldn't be, or the writer is trying to say something completely different. Autocorrect is responsible for (somewhat) fixing these issues. Despite autocorrect's occasionally humorous faults, it's an integral part of ensuring your team puts together error-free emails and documents. Technology can do all sorts of great things for your business. Give White Mountain IT Services a call at (603) 889-0800 to learn about making technology work for you.
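To make the idea concrete, here is a toy corrector in Python. This is only a minimal sketch, not Microsoft's actual algorithm: it scores candidate words by string similarity (a rough stand-in for keyboard proximity and phonetics) against a small hypothetical dictionary, and leaves unrecognized words alone.

```python
import difflib

# Hypothetical mini-dictionary of correctly spelled words.
DICTIONARY = ["separate", "effect", "affect", "misspell", "recommend", "the"]

def autocorrect(word):
    """Return the closest dictionary word, or the word itself if no
    candidate is similar enough (mimicking autocorrect leaving
    unknown words untouched)."""
    matches = difflib.get_close_matches(word.lower(), DICTIONARY, n=1, cutoff=0.6)
    return matches[0] if matches else word

# "teh" was the first word autocorrect ever targeted.
print(autocorrect("teh"))         # -> "the"
print(autocorrect("seperate"))    # -> "separate"
print(autocorrect("reccommend"))  # -> "recommend"
```

A real autocorrect would also weigh sentence context (e.g., "its" vs. "it's"), which a pure similarity score cannot capture.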

How You Handle Your Mouse Says a Lot About You

The purpose of two-factor authentication is to add an extra layer of security when logging into a device or website. Everyone is familiar with using a password to log in to their email, bank account, or social network. Two-factor authentication requires the user to know more than just the password; they have to further prove that they are who they say they are. Typically, this is done by entering a short PIN sent to the user via text. According to BioCatch, the way that a PC user handles a mouse can identify who they are. ZDNet explains: "The entire way that we use the human-machine interface embedded within each and every modern computer, browser, or website, is like a unique fingerprint. Lefties will operate a mouse differently to right-handed people, for example, and each user 'grabs' an icon at a different point, angle, and so on." BioCatch can analyze the way that users handle their mouse to create a profile for them. This profile is then used to determine whether or not you are who your machine says you are. It's estimated that this method of authentication is capable of identifying and preventing fraudulent logins 80 to 90 percent of the time. There are several variables taken into consideration by BioCatch's user profiles. These variables are organized into four layers of properties:

Layer One: Standard Authentication
Layer one consists of the device, network, IP address, hardware, and location - all traits that physically tie you to your PC. These are the typical authentication properties used when logging into an account. The following layers, however, take a much different approach to authentication.

Layer Two: Physical Profile
Layer two consists mainly of motion-related actions, such as moving objects around the screen, hand-eye coordination, and use of the mouse pointer (or finger on touchscreen devices).

Layer Three: Cognitive Profile
Layer three consists of examining cognitive traits, such as response time and connection time.
It also looks for suspicious activity that is out of the norm. One example used by ZDNet is online banking: normally, a user would check their balance before doing anything else. If a money transfer is their top priority, something might be up.

Layer Four: Invisible Challenges
BioCatch's final layer of protection is meant to authenticate a user's identity, but not in the traditional sense. BioCatch purposely puts problems in the way of the user in order to determine who they are. Everyone reacts to such obstacles differently, and their response can be used to verify their identity.

Regardless of whether or not BioCatch's ideas become mainstream, one thing is certain: you need to keep yourself protected from hackers with more than just a password. Two-factor authentication might have its flaws, but it's a better protection measure than most, especially considering how vulnerable passwords have proven recently in light of powerful, sophisticated malware. White Mountain IT Services can equip you with two-factor authentication methods, like SMS messages sent to a cell phone. Call us at (603) 889-0800 today to see what we can do for your business.
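To give a feel for the "physical profile" layer, here is a minimal sketch of extracting simple behavioral features from a stream of mouse samples. The trace and feature names are hypothetical; BioCatch's actual algorithms are proprietary and far more sophisticated.

```python
import math

def mouse_features(events):
    """Compute simple behavioral features from (timestamp, x, y) mouse
    samples: average pointer speed and average change in movement
    direction. Real behavioral-biometric systems use many more features."""
    speeds, headings = [], []
    for (t0, x0, y0), (t1, x1, y1) in zip(events, events[1:]):
        dist = math.hypot(x1 - x0, y1 - y0)
        dt = t1 - t0
        if dt > 0:
            speeds.append(dist / dt)
        headings.append(math.atan2(y1 - y0, x1 - x0))
    avg_speed = sum(speeds) / len(speeds) if speeds else 0.0
    # Average absolute change in direction between consecutive segments.
    turn_total = sum(abs(b - a) for a, b in zip(headings, headings[1:]))
    avg_turn = turn_total / max(len(headings) - 1, 1)
    return {"avg_speed": avg_speed, "avg_turn": avg_turn}

# Hypothetical sample: a user moving the pointer roughly rightward.
trace = [(0.00, 0, 0), (0.05, 10, 1), (0.10, 22, 1), (0.15, 30, 2)]
profile = mouse_features(trace)
```

A verification system would compare these numbers against the values stored in a user's enrolled profile; a large deviation suggests someone else is at the keyboard.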

Tip of the Week: 52 Minutes On and 17 Off Maximizes Your Productivity Potential

That might sound a bit strange, but according to Julia Gifford of the Draugiem Group, it's true. The Draugiem Group recently performed an experiment that targeted its most successful, productive employees, using the time-tracking application DeskTime to see how these team members spent their time. Ultimately, it came down to how much time the team spent on break compared to how much time was spent working.

The 52-17 Rule
The ratio, according to the study, turned out to be 17 minutes of rest in between 52-minute work intervals. This might sound similar to the Pomodoro Technique, where you work in 25-minute intervals with five-minute breaks in between, and a longer 15-minute break after the fourth interval. The difference here is that you work hard and fast for 52 minutes, then take a 17-minute break before another burst of work.

Rest is Important
Not surprisingly, several other researchers and business professionals have reached a similar conclusion. Entrepreneur Chris Winfield claims that he was able to cut his work week in half thanks to the sheer power of time management and regular breaks. He used the Pomodoro Technique to discover just how much work he could squeeze out of his work week. It took plenty of trial and error for him to find the right increments, given meetings and other things beyond his control, but in the end, he was able to cut a 40-hour work week down to 16.7 hours. And, the best part was that he was still getting the same amount of work done, even with all of the breaks.

The Result: Much More Flexibility
Thanks to getting much more work done in a shorter amount of time, Winfield was able to introduce more flexibility into his work schedule. Of course, the only way to get so much done in a short amount of time is to prioritize tasks in a way that allows you to tackle the most important objectives with maximum efficiency.
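If you want to try the cadence yourself, the 52-17 cycle is easy to script. The helper below is purely a hypothetical illustration, not anything from the study; it simply alternates work and rest intervals, with a configurable "minute" length so the schedule can be fast-forwarded.

```python
import time

def interval_timer(cycles=2, work_min=52, rest_min=17, minute=60):
    """Alternate work sprints with breaks (52-17 by default).
    `minute` is the number of seconds per 'minute', so a demo can
    run instantly instead of taking hours."""
    schedule = []
    for cycle in range(1, cycles + 1):
        print(f"Cycle {cycle}: work for {work_min} minutes")
        time.sleep(work_min * minute)
        schedule.append(("work", work_min))
        print(f"Cycle {cycle}: rest for {rest_min} minutes")
        time.sleep(rest_min * minute)
        schedule.append(("rest", rest_min))
    return schedule

# Fast-forwarded demo: each "minute" lasts 0 seconds.
plan = interval_timer(cycles=1, minute=0)
```

Swapping in work_min=25, rest_min=5 would give you a basic Pomodoro cadence instead.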
Winfield also states that the number of days he worked every week contributed to his success: "The final piece to my puzzle was moving from a five-day work week, where I had to stop by 5 p.m., to a seven-day work week, where I could work when it suited me. This took me from 40 to 45 hours available to get my 40 Pomodoros in, to having 168 hours each week. Since I only need 16.7 hours net, that means I only work 10% of my time. What a difference." He doesn't count the 20 to 25 hours of meetings and calls toward that time, but he still managed to save himself some sanity by taking small breaks in between big projects and work increments. If you aren't quite sold on these work-break-work techniques, there are other ways you can boost your productivity. White Mountain IT Services can help your business take advantage of productivity-increasing technology. Give us a call at (603) 889-0800 to see how we can cut the time you spend working in half.

Humans and Robots: An Uneasy Trust

Case in point: an incident occurred this month in Germany at a Volkswagen plant, where, tragically, a robot malfunction left a 22-year-old worker dead. According to The Guardian, the man was part of an assembly team that was setting up the robot. The robot responsible for his death ordinarily works within a confined area, moving auto parts and manipulating them to meet a certain goal. Except this time, instead of an auto part, the robot grabbed the contractor, causing a fatal injury. Reportedly, human error is to blame for the incident rather than the robot itself.

In another example of robots causing potential harm to users, consider the self-driving car program from Google. Currently in the testing phase, driverless cars have taken to the roads in California. So far, the headlines have been dominated by these cars getting in one fender bender after another. Thankfully, no one has been hurt by these driverless cars like what happened in Germany. Although, if these errors aren't fixed and Google's driverless cars end up on every highway, you can imagine what a disaster that would be.

Incidents like this lead us to ask questions like, "Should humans blindly trust technology to perform to its optimal specifications?" and, "Is your business ready for the day user error causes a critical loss?" While it's unlikely that a workstation or a server will injure one of your employees, these machines can certainly be responsible for the death of another precious entity: your data. User error is a primary cause of data loss, and it's easy to understand why. For example, if you're not thorough with your training and security policies, unwary users might accidentally leak information outside of the network using a mobile device. Furthermore, if a user doesn't adequately save their progress on a project, it can be lost in the blink of an eye, making the time spent up until that point worthless.
Despite the fact that technology can be fairly unpredictable, human error is just as dangerous as hardware failure or any other technology-related issue. Computers are operated by humans, and are, therefore, susceptible to the imperfections of the humans that operate them. It's important to keep in mind that your users are human, and that even the most cautious person can make mistakes leading to data loss (or worse). Humans and machines have an uneasy relationship; technology isn't always reliable, and user error is prone to causing major problems. What does the future hold for this human/robot dynamic? Do you believe there will be a Terminator-like future where humans are at war with self-aware robots? Or do you think improving and perfecting machines is the key to a technology-filled utopia? Share your thoughts with us in the comments.

Your Backup and Disaster Recovery Solution Should Be Easy as 3, 2, 1

Have you ever heard of the 3-2-1 rule for deciding on a backup solution? It's very simple, and easy to remember in a pinch. According to InfoWorld, you want three copies of your organization's data. Two of these should be on different storage formats, and one should be stored off-site for extra data redundancy.

If your business is one of the many that still run tape, you'll be relieved to hear that there's a far more efficient way of keeping your data as up-to-date as possible: taking advantage of a cloud backup solution to safely and securely transfer your data to a data center. Unlike tape backup, a cloud backup solution allows your business to take multiple snapshots of your data daily, allowing for comprehensive and redundant data retention. Furthermore, you can easily recover data from your cloud solution, making it ideal not only for storage purposes, but also for recovery.

The only issues that many businesses get hung up on are the technical details, not to mention the pricing of a complete overhaul of their entire data backup process. Many small and medium-sized businesses find it difficult to afford such an in-depth procedure, or they simply don't have the manpower available to integrate a comprehensive backup solution. Yet in a world where hackers can target your data and steal it or render it useless, you need to take any and all precautions to prepare for the worst.

To save SMBs both time and money, White Mountain IT Services offers managed IT services that help optimize operations and increase productivity in the workplace. One of our most important solutions, especially for a business that can't afford to lose everything in an instant, is our Backup and Disaster Recovery (BDR) solution. It's designed to take multiple backups throughout your workday and send copies to an off-site data center for storage and later recovery.
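The 3-2-1 rule itself is simple enough to check programmatically. Here is a small sketch using hypothetical copy names and media labels (not part of any vendor's product) that tests whether a backup plan meets all three criteria:

```python
# Each entry: (copy name, storage medium, stored off-site?)
backup_plan = [
    ("primary",  "server_disk", False),
    ("snapshot", "nas",         False),
    ("cloud",    "cloud",       True),
]

def satisfies_3_2_1(plan):
    """3 copies of the data, on at least 2 different media,
    with at least 1 copy off-site."""
    copies = len(plan)
    media = {medium for _, medium, _ in plan}
    has_offsite = any(offsite for _, _, offsite in plan)
    return copies >= 3 and len(media) >= 2 and has_offsite

print(satisfies_3_2_1(backup_plan))  # -> True
```

Drop the cloud copy and the check fails on both the copy count and the off-site requirement, which is exactly the situation a tape-only shop is in.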
Arguably, the most valuable asset that our BDR offers is its ability to practically neutralize downtime. When you lose access to your data due to a natural disaster, power outage, or other potentially disastrous incident, the BDR device can rapidly act as your server, deploying your backed-up data and allowing your business to function even under the worst conditions. This gives you time to make the proper arrangements for replacement technology if your server is inoperable or insecure. To make data backup as easy as 3-2-1, give us a call at (603) 889-0800.