Blog

September 28th, 2009

SACRAMENTO, Calif. – GNT Solutions has been named one of the Sacramento Region’s Fastest Growing Companies by the Sacramento Business Journal.

In September, the Sacramento Business Journal announced the companies that made its 2008 Fastest Growing Companies list. GNT Solutions ranked 14th on the list, with growth of 163% over the past three years.

Topic News
January 13th, 2017

Windows 10 had a rough year. It’s been flooded with complaints from users who claim they were "tricked" into upgrading from Windows 7 and who are fed up with uncontrollable OS updates. The new year is a time for new things, and that’s exactly what Microsoft is planning for 2017. It’s been speculated that Microsoft plans on addressing certain issues, but many others remain unaddressed.

A patch and a glitch away

Users couldn’t escape the glitches in the frequent Windows 10 updates, which caused an array of problems such as frozen systems, broken webcams, and even PCs being unable to secure an Internet connection. In light of the patching dilemma, Microsoft is offering more options to defer updates. In fact, a leaked preview shows a new option to pause updates for up to 35 days via a switch in the Settings menu.

OneDrive placeholders

Since the launch of Windows 10, many users have eagerly waited for Microsoft to re-introduce this beloved feature to the operating system’s built-in OneDrive cloud storage service. In Windows 8.1, placeholders (aka Smart Files) allowed users to see all their OneDrive files, whether or not they were stored on the device. Making its return in the Windows 10 File Explorer, the feature shows OneDrive files stored locally as well as in the cloud.

Owning up to the update fiasco

Not only is Microsoft addressing the various complaints it received, but it’s also owning up to some of them. Just before Christmas, Microsoft’s chief marketing officer, Chris Capossela, admitted that the company had gone too far when it tried to get Windows 7 and 8 users to upgrade to Windows 10. He was referring to Microsoft’s decision in early 2016 to change the design of the user prompt for its Get Windows 10 app, the software responsible for scheduling upgrades. The prompt was altered so that clicking X to close the window caused the user to unknowingly agree to a Windows 10 upgrade -- a change that put Microsoft in direct violation of its own user experience guidelines for developers on dialog box design.

“Within a couple of hours of that hitting the world, we knew we had gone too far,” recalled Capossela. “Those two weeks were pretty painful and clearly a lowlight for us.” It was then that Microsoft reversed its decision on tweaking the pop-up, so clicking on X would dismiss the upgrade.

It seems that 2017 is the year Microsoft will start listening more to its users and addressing their complaints. Hopefully, this renewed strategy will benefit users sooner rather than frustrate them later. Contact us and we’ll keep you up to date with the latest Microsoft updates.

Published with permission from TechAdvisory.org. Source.

Topic windows
January 12th, 2017

Ransomware, Trojan horses, spyware, and malware are things small businesses like yours never want to come across. While the term cyber security was once thrown around to scare businesses into purchasing security software, today’s sophisticated threats can do immense damage -- something an antivirus solution alone can’t handle. With that in mind, we’ve rounded up the top cyber attack statistics that prove you need managed services in order to remain safe and operational.

The numbers

Small businesses are not just at risk of being attacked; worse, many have already fallen victim to cyber threats. According to Small Business Trends, 55 percent of survey respondents say their companies experienced a cyber attack sometime between 2015 and 2016. Not only that, 50 percent reported data breaches involving customer and employee information during that time, too. The aftermath of these incidents? These companies spent an average of $879,582 to fix the damage done to their IT assets and recover their data. To make matters worse, disruption to their daily operations cost an average of $955,429.

The attacks

So what types of attacks did these businesses experience? The order from most to least common is as follows: web-based attacks, phishing, general malware, SQL injection, stolen devices, denial of service, advanced malware, malicious insiders, cross-site scripting, ransomware, and others.

Why managed services?

Managed services are the most effective form of prevention and protection against these malicious threats. They include a full range of proactive IT support focused on advanced security, such as around-the-clock monitoring, data encryption and backup, real-time threat prevention and elimination, network and firewall protection, and more.
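To give a concrete sense of what "around-the-clock monitoring" involves, here is a minimal sketch of the kind of automated health check a managed services provider might run against a client's servers. The host names, ports, and check interval are hypothetical placeholders, not any particular vendor's tooling.

```python
import socket
import time
from datetime import datetime

# Hypothetical hosts and service ports a provider might watch around the clock.
HOSTS = [("mail.example.com", 25), ("www.example.com", 443), ("files.example.com", 22)]
CHECK_INTERVAL = 300  # seconds between sweeps

def is_reachable(host, port, timeout=5):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def monitor():
    while True:
        for host, port in HOSTS:
            status = "UP" if is_reachable(host, port) else "DOWN"
            print(f"{datetime.now().isoformat()}  {host}:{port}  {status}")
            # A real monitoring platform would alert an on-call engineer here
            # instead of just printing to the console.
        time.sleep(CHECK_INTERVAL)

if __name__ == "__main__":
    monitor()
```

A managed services provider layers alerting, logging, and response procedures on top of checks like this, but the basic loop is the same: test constantly and flag failures immediately.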

Not only that, but because managed services are designed to identify weak spots in your IT infrastructure and fix them, you’ll enjoy other benefits including faster network performance, business continuity, disaster recovery, and minimal downtime. One of the best things about managed services is that you get a dedicated team of IT professionals ready to assist with any technology problems you might have. This is much more effective and budget-friendly than having in-house personnel handle all your IT issues.

Being proactive about cyber security is the only way to protect what you’ve worked hard to build. If you’d like to know more about how managed services can benefit your business, just give us a call -- we’re sure we can help.

Published with permission from TechAdvisory.org. Source.

Topic Security
January 11th, 2017

Over the past several months, you’ve probably heard about new and disruptive trends like virtual assistants, smartphones, and automation technologies. Some of these IT solutions may even sit at the top of your business priority list. However, with floods, fires, and power outages always just around the corner, disaster recovery and business continuity plans should always have a place in your annual budget.

DR isn’t a huge investment

A common misconception about disaster recovery is that it’s a large, bank-breaking investment. Expensive secondary data centers, networks, and server maintenance usually come to mind when a business owner is confronted with the idea of business continuity. And while that may have been true in the past, establishing a strong disaster recovery plan today is as simple -- and as cheap -- as going to a cloud-based disaster recovery provider and paying for the data and services that your business needs. Subscription pricing is actually remarkably low, meaning you can have minimal downtime while still having enough left over to invest in new tech.

Onsite backups just won’t cut it

Although you might feel secure with a manual backup server down the hall, it is still susceptible to local disasters and, ultimately, does very little to minimize company downtime. When disaster recovery solutions are hosted in the cloud or on a virtualized server, restoring critical data and applications takes only a few minutes.
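To make the onsite-versus-offsite point concrete, here is a minimal sketch of a nightly job that copies a local backup archive to Amazon S3 with the boto3 library. The bucket name and file path are hypothetical; a real cloud disaster recovery service adds scheduling, encryption, versioning, and retention policies on top of a transfer like this.

```python
from datetime import date
from pathlib import Path

import boto3  # AWS SDK for Python (pip install boto3)

# Hypothetical values -- substitute your own bucket and backup location.
BUCKET = "example-company-dr-backups"
LOCAL_BACKUP = Path("/backups/nightly.tar.gz")

def upload_backup():
    """Copy the latest local backup archive to offsite cloud storage."""
    s3 = boto3.client("s3")
    key = f"nightly/{date.today().isoformat()}/{LOCAL_BACKUP.name}"
    s3.upload_file(str(LOCAL_BACKUP), BUCKET, key)
    print(f"Uploaded {LOCAL_BACKUP} to s3://{BUCKET}/{key}")

if __name__ == "__main__":
    upload_backup()
```

Because the copy lives offsite, a flood or fire in the server room doesn’t take the backup down with it.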

Business disasters can be man-made, too

Even if your workplace is nowhere near frequent disaster zones, cyber attacks and negligent employees can leave the same impact on your business as any natural disaster can. Setting a weak password, clicking on a suspicious link, or connecting to unsecured channels is enough to shut down a 5-, 10-, or even 50-year-old business in mere minutes.

Sure, installing adequate network security is a critical strategy against malicious actors, but last year’s barrage of data breaches suggests that having a Plan B is a must. A suitable disaster recovery plan ensures that your data’s integrity is intact and your business can keep going, no matter the malware, worm, or denial-of-service attack.

Downtime will cost you

A business without a DR plan might come out unscathed after a brief power outage, but why risk the potential damages? Either way, downtime will cost your business. First, there’s the general loss of productivity. Every time your employees aren’t connected to the network, money goes down the drain. Then there’s the cost of corrupted company data, damaged hardware, and the inevitable customer backlash. Add all those variables together, and you end up with a business-crippling fee.

So, if you want 2017 to be the best year for your business, make the smart choice and proactively take part in creating your company’s business continuity plan. Your business will be in a better position financially with it than without it.

Keep your business safe, recover from any disaster, and contact us today.

Published with permission from TechAdvisory.org. Source.

Topic business
January 10th, 2017

With communication apps such as Skype, WhatsApp, and Slack aiming to conquer audio and video calling, what is a social networking giant like Facebook to do? In a move to enhance its already ubiquitous Messenger app, Facebook is set to follow suit and occupy a space alongside some of the most dominant VoIP apps on the market, particularly in desktop group video calling.

No one gets left behind

In group chats, there’s always that one person who gets the joke last, or reads it last, and so feels left out. With Facebook’s group video chat, this never has to happen as every participant in the group can be connected at the same time. The functionality is still in the “small test” phase, which means certain details are still being ironed out. When it fully launches, though, expect a considerable portion of Facebook Messenger users to consider moving their Skype group video conversations to Facebook.

Potentially compelling benefits

Since introducing its audio calling capability in 2013, Facebook has worked hard to keep up with the competition. Facebook’s introduction – and potential domination – of desktop group video calling might signal the beginning of the end for Skype and other players. For one, Facebook and social media user growth hasn’t shown signs of slowing down. Moreover, most users of internet telephony may eventually see the advantages of using a single platform for their social media activity and online communications.

Possible user reservations

Users who prefer a communication tool that creates zero distractions in their chats might not opt for Facebook when conducting group video conferences. Facebook is, first and foremost, a social media platform, which serves as a springboard for important news updates, personal anecdotes, and funny cat videos. Not everyone will prefer all those distractions while in a business meeting.

Those looking for a clean communications tool might find the wealth of content a bit overwhelming. In addition, employees of small companies in need of a free communications tool might not immediately warm up to the idea of surrendering their Facebook profile as an official point of contact.

If there’s anything Facebook has proven, however, it is the ability to improve upon previous versions by adding or removing details to enhance user experience. Users may not always be pleased with the updates, but that hasn’t slowed down the billion-user company’s popularity. If the newly introduced desktop group video calling function proves efficient, Facebook might persuade even more users to lean towards its messaging tool.

With many options to choose from, the only thing left for people to do is discern which platform best serves their communications needs. For advice on which VoIP platform is best for your business, contact us today.

Published with permission from TechAdvisory.org. Source.

Topic Social Media
January 4th, 2017

Migrating your business’s data, applications, and other critical resources to the cloud requires time and a bit of money. Performing a large-scale migration to Amazon Web Services delivers many benefits and is a cost-effective solution that most businesses should adopt. But even with its potential to increase your company’s efficiency, there are factors that need to be considered before moving to the cloud. Here are some of the most important ones.

Preparation for migration

  • Is everyone within the organization on board with this major move? Are your employees adequately equipped with knowledge about the cloud? And, since large-scale transfers involve big data, would your security framework be able to deal with potential security threats during the transition? Can your company handle the inevitable expenditure that goes with investing in the cloud? These are just some of the points you have to consider when preparing for large-scale migration.

Reasons for migration

  • One of the most compelling reasons to virtualize tech capital is the need to meet your business’s increasing demand for efficiency, which could lead to greater profitability. Other reasons could include change of organizational leadership or a shift in business structure that necessitates storage recalibration. Regardless of your reasons for migrating to the cloud, you as a business owner should have a clear understanding of why you’re doing it, and make sure everyone understands why it is so important.

Size of resources to be moved

  • Using Amazon Web Services’ cloud storage eliminates the cost of buying your own storage infrastructure and gives you anywhere-anytime access to your business’s data and applications. That said, you must consider how much data you’ll be transferring and use that as your basis for the move. Knowing the amount of IT resources you’re freeing up lets you allocate budget more cost-effectively and allows your technology staff to focus on more innovative pursuits.
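One quick way to put a number on "how much you’ll be transferring" is to total the size of the directories you plan to move before the migration starts. The sketch below is a rough, hypothetical estimator; the paths are placeholders for your own file shares and application data.

```python
import os

# Hypothetical directories slated for migration to cloud storage.
MIGRATION_PATHS = ["/srv/fileshare", "/var/lib/app-data"]

def directory_size_bytes(path):
    """Walk a directory tree and sum the size of every regular file in it."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            full_path = os.path.join(root, name)
            if os.path.isfile(full_path):
                total += os.path.getsize(full_path)
    return total

if __name__ == "__main__":
    grand_total = 0
    for path in MIGRATION_PATHS:
        size = directory_size_bytes(path)
        grand_total += size
        print(f"{path}: {size / 1e9:.2f} GB")
    print(f"Total to migrate: {grand_total / 1e9:.2f} GB")
```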

Migration requirements

  • Which specific data, servers, or applications need to be migrated? Does your company need a large-scale migration, or can it get by with moving only a small part of your resources to the cloud? Perhaps a subsidiary could survive without being moved to the cloud at all. When migrating to the cloud, you’d be remiss not to think through these details.

Impact to the business

  • Temporary downtime is something you have to be ready for. You might need more time or you might need to consider alternatives for the brief interruptions that come with migration, and of course budget can be a major factor in your decision to move. You can save your business from unnecessary obstacles by first assessing its ability to handle these situations.
Recalibrating the management of your technological resources for scalable storage solutions in a cost-saving platform is not without its challenges. Your business and its stakeholders’ call for greater efficiency cannot be ignored. After considering these factors for a large-scale migration, you might realize that despite a few minor bumps, the benefits to your organization will far outweigh the projected costs, and that there’s nowhere to go but up (in the cloud).
Published with permission from TechAdvisory.org. Source.

December 29th, 2016

Microsoft Edge has recently been changed and updated. While that in itself is nothing unusual, what is unusual is the sudden choice to no longer be as Adobe Flash-friendly as it once was. The blocking of Adobe Flash by Microsoft's primary web browser can have significant repercussions for businesses and web users alike. As a business owner, you may wonder whether your website and its various bells and whistles will be affected by these changes. To understand what is going on between Microsoft Edge and Adobe Flash, read on to learn the rationale behind the decision as well as how your business may be affected.

The primary purpose behind the recent changes made to Microsoft Edge is to make it more competitive with the popular Google Chrome web browser. Among efforts to do just that is the change to how Adobe Flash works on the Edge browser. Now, instead of Adobe Flash plugins playing and loading immediately when a person navigates a website, the application will be blocked.

An alert will appear near the address bar, letting users know that Adobe Flash has been blocked and giving them the option to run the add-on or continue blocking it. For businesses that use Adobe Flash throughout their websites, this can be a frustrating change, as visitors will need to take an extra step to access the full website.

However, there are numerous legitimate reasons for these changes to the Microsoft Edge browser. The most important is that Adobe Flash is a security risk and easily exploited, making it more likely that web users will lose information and control of their machines. Another issue is that Adobe Flash is a big drain on battery life for computers and other devices.

The theory is that Adobe Flash is on its way out, and that newer, better systems are on their way in. As of now, Windows Insider users are the only ones with access to these updates, but soon the updates will go global and be made available to all users. In fact, Microsoft plans to eventually automatically load HTML5 web information first without loading Adobe Flash content at all.
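If you’re not sure whether your own pages still rely on Flash, a quick audit of your site’s HTML is a reasonable first step. The sketch below is a simple, hypothetical example that scans a local copy of a site for the object and embed tags and .swf references typically used to load Flash content; it is not a substitute for a full site review.

```python
import re
from pathlib import Path

# Hypothetical folder containing a local copy of your site's HTML files.
SITE_ROOT = Path("./site-html")

# Flash is usually embedded via <object>/<embed> tags or direct .swf references.
FLASH_PATTERN = re.compile(r"<(object|embed)\b|\.swf\b", re.IGNORECASE)

def find_flash_usage(root):
    """Yield (file, line number, line) for every line that appears to reference Flash."""
    for html_file in root.rglob("*.html"):
        text = html_file.read_text(errors="ignore")
        for lineno, line in enumerate(text.splitlines(), start=1):
            if FLASH_PATTERN.search(line):
                yield html_file, lineno, line.strip()

if __name__ == "__main__":
    hits = list(find_flash_usage(SITE_ROOT))
    for path, lineno, line in hits:
        print(f"{path}:{lineno}: {line[:80]}")
    print(f"{len(hits)} possible Flash reference(s) found.")
```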

Because so many sites use Adobe Flash, this can mean major renovations to existing web content. If you worry about the impact this will have on your business, contact us for immediate help and assistance in maximizing your website usability before these changes go live for all Microsoft Edge users.

Published with permission from TechAdvisory.org. Source.

Topic windows
December 28th, 2016

It's scary to think you can be simply browsing the Internet when WHAM! a screen pops up out of nowhere claiming that you have been hijacked and will need to pay a bitcoin to free your computer. Unfortunately, ransomware like this is not uncommon. But now there's a new, more devastating virus that asks victims to pick other victims to replace them in order to get their computer information back safely. Read on to find out how Popcorn Time is turning the ransom game on its head -- and how you can protect yourself from it.

Ransomware is nothing new. Cybersecurity miscreants have been taking advantage of online users for years by requiring payment to "unlock" a victim's computer. What Popcorn Time does differently is give users the option to spread the virus to two other victims in the hopes that they will pay the ransom -- a tactic that promises to double their money at the expense of your sense of morality (and at the expense of your friendships as well).

The Cost of Popcorn

When you inadvertently download this ransomware, you will be met with a screen explaining that your files have been hijacked and encrypted, and that to get them back you will need to pay one Bitcoin for a decryption key that the attackers keep stored remotely. The Bitcoin fee is usually more than $700 -- a hefty price to pay during any season, but particularly difficult for those infected during the holidays.

Spread the "Holiday Cheer" and Hope they Bite

What makes Popcorn Time unique is the option victims have to avoid paying by allowing the ransomware to infect two of their friends in exchange for a chance at a free decryption code. Of course, it works only if both friends pay the ransom, which leaves you looking (and feeling) like the Grinch.

Avoiding Popcorn Time this Season

The easiest way to avoid downloading ransomware is to stay off sites that might contain questionable files. However, this is nearly impossible for modern users, and many hackers are getting good at making their files look legitimate. Limit your exposure to potential ransomware by keeping your software up to date and your computer protected with a security program from a reputable company (for example, Norton or Symantec). If you need to learn more about how to avoid running into ransomware while you're online, give our professional cybersecurity consultants a call. We'll keep you away from the popcorn this season.

Published with permission from TechAdvisory.org. Source.

Topic Security
December 22nd, 2016

At a recent cloud technology conference, Amazon Web Services (AWS) launched a tool to help streamline data analytics in the cloud. This new tool, named “Glue,” is designed to reduce the burden on engineers and employees so they can get down to the important elements of data analytics. Read on for an explanation of AWS Glue and the ways it can benefit your business.

Data analysis can be an extremely profitable arm of your business, if undertaken carefully. Much of what people consider to be data analysis for a business is actually just digital clerical work, which makes the process even more frustrating and time-consuming than it needs to be. At its core, AWS’s Glue is an app that automates this tedium, which is often referred to as ETL (extract-transform-load).
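For readers unfamiliar with the term, the sketch below shows the kind of hand-written extract-transform-load step that a service like Glue is meant to automate. The CSV layout, column names, and target database are hypothetical examples, not anything specific to AWS.

```python
import csv
import sqlite3

# Hypothetical source file and target database for this example.
SOURCE_CSV = "sales_export.csv"
TARGET_DB = "warehouse.db"

def run_etl():
    # Extract: read raw rows from the exported CSV file.
    with open(SOURCE_CSV, newline="") as f:
        rows = list(csv.DictReader(f))

    # Transform: normalize fields so they match the warehouse schema.
    cleaned = [
        (row["order_id"].strip(), row["customer"].strip().title(), float(row["amount"]))
        for row in rows
        if row.get("amount")
    ]

    # Load: write the cleaned rows into the target database.
    con = sqlite3.connect(TARGET_DB)
    con.execute(
        "CREATE TABLE IF NOT EXISTS sales (order_id TEXT, customer TEXT, amount REAL)"
    )
    con.executemany("INSERT INTO sales VALUES (?, ?, ?)", cleaned)
    con.commit()
    con.close()
    print(f"Loaded {len(cleaned)} rows into {TARGET_DB}")

if __name__ == "__main__":
    run_etl()
```

Multiply that by dozens of data sources and formats, and it’s easy to see why automating the step is attractive.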

Third-party software already exists to help with this task, but the service from AWS is one of the first cloud-based alternatives to come to market. Glue is designed to work with businesses that have their own on-premises data centers and infrastructures in addition to working with AWS frameworks. In fact, if a business makes changes to on-premises data, Glue can be set up to trigger jobs and update the data in the cloud so users always have access to the most up-to-date information for use and analysis.

Essentially, Glue extracts various types of data from a wide array of sources and analyzes it, ultimately homogenizing the data to fit the business's existing database. This eliminates a great deal of work because the extremely tedious task of importing data is often done by hand. Handing this burden over to AWS allows businesses to focus on the real analysis work, saving effort, time, and money in the process.

Every day, data becomes more and more integral to building a successful company. And with such a heavy burden placed on this facet of business, falling behind on the technology that makes it possible is an expensive mistake. If you’re hosting large amounts of data on-premises or in an AWS database, contact us today about how you can eliminate costly ETL processes.

Published with permission from TechAdvisory.org. Source.

Topic business
December 20th, 2016

Virtual containers have incrementally increased the ability of users to create portable, self-contained kernels of information and applications since the technology first appeared in the early 2000s. Now, containers are one of the biggest data trends of the decade -- some say at the expense of the virtual machine (VM) technology that preceded them. Read on to find out some of the performance differences between containers and virtual machines, and how the two can work together for your business.

When it comes to the virtual world, containers and VMs are not all that different. The VM is a good option for those who need to use more than one operating system in the course of a business project, while containers serve those who are comfortable staying within a Linux or Windows operating system without deviating. There are performance advantages to using containers, although these are counterbalanced by organizational advantages derived from a VM system.

Performance Nuances

VMs and containers both work from a virtual platform; therefore, the differences in performance relate to how they are configured and utilized by the people who maintain them.
  • Faster startup time: Containers don't have as much to start up, so they open more quickly than virtual machines. While it may not seem revolutionary, the savings can be up to a few minutes per instance -- a cost that adds up to quite a bit over the course of a year or more (see the sketch after this list).
  • Resource distribution: Containers only need to pull hardware resources as needed, while a VM requires a baseline of resources to be allocated before it will start up. If you have two VM processes running at the same time, this might mean two of the same programs are pulled up even if they aren't being used.
  • Direct hardware access: A VM cannot pull information from outside of itself (the host computer), but a container can utilize the host system as it runs. This may or may not matter depending on what your users are doing, but certainly puts a point in the container column nonetheless.
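To see the startup-time difference for yourself, you can time how long a small container takes to launch on a machine that already has Docker installed. This sketch uses the Docker SDK for Python and the public alpine image; the numbers it prints are only a rough illustration and will vary with your hardware.

```python
import time

import docker  # Docker SDK for Python (pip install docker)

def time_container_start(image="alpine", runs=5):
    """Time how long it takes to start a small container and run a trivial command."""
    client = docker.from_env()
    client.images.pull(image)  # pull once so network time doesn't skew the results
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        client.containers.run(image, "true", remove=True)  # start, run 'true', clean up
        timings.append(time.perf_counter() - start)
    return timings

if __name__ == "__main__":
    for i, seconds in enumerate(time_container_start(), start=1):
        print(f"run {i}: {seconds:.2f}s")
```

Booting a full virtual machine for the same comparison typically takes on the order of minutes rather than seconds, which is the gap the first bullet above describes.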
Although it appears that containers out-perform virtual machines in most areas, there are uses for the VM environment, particularly for a business on the rise. With a virtual machine you have a security advantage because each VM environment is encapsulated with its own operating system and data configuration; additionally, you are not limited to the use of one operating system.

Virtualization is an incredibly tricky solution to grasp in its entirety. New avenues spring up all the time to get more use out of its benefits, and it might be tempting to take a “wait and see” mentality. In reality, one of the best things about virtualization is how adaptable it is as a business solution. We suggest you get into the game as soon as possible; give us a call so we can tell you how.

Published with permission from TechAdvisory.org. Source.

December 14th, 2016

One of the issues facing most users of Microsoft's latest operating system is the amount of time and processing power required to perform Windows 10 updates. This causes problems for businesses and individual users alike, because the newest Windows operating system processes these updates automatically. However, Microsoft has come up with a solution to the slow-update problem, and it may save you a great deal of frustration.

What Microsoft is proposing to streamline the Windows 10 update process is a system known as the Unified Update Platform (UUP). The UUP is essentially a large series of changes to Windows 10, all of which occur behind the scenes and will not affect the overall user experience. These changes will reduce the amount of processing power required to update Windows and make updates faster for Windows 10 users who need to keep things moving along quickly.

This will be accomplished in a number of ways, including significantly shrinking the size of update files for all devices and, in particular, streamlining the Windows phone update process. One of the ways Microsoft proposes to speed up updates is by sending updates that are device-specific rather than distributing a full bundle of updates together, some of which are not necessary for the device in question.

Currently, Windows 10 updates essentially overhaul the entire version of Windows 10 that users have on their device. This makes the update process easier on Microsoft, but not on users. Instead of this system, the UUP will eventually allow updates to occur only to the specific programs and systems that need updating, leaving the rest of the operating system untouched. Larger system-wide updates will also be much faster and more efficient with the UUP system in place.
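The idea behind these smaller, targeted downloads can be illustrated with a simple differential comparison: instead of shipping a whole new file, you ship only the parts that changed. The toy sketch below compares the size of a full replacement against the size of a text diff; it is only an illustration of the delta-update concept, not how Microsoft's UUP actually packages Windows updates.

```python
import difflib
from pathlib import Path

# Hypothetical old and new versions of a text-based component.
OLD_FILE = Path("component_v1.txt")
NEW_FILE = Path("component_v2.txt")

def compare_update_sizes(old_path, new_path):
    """Compare shipping the whole new file against shipping only a diff."""
    old_lines = old_path.read_text().splitlines(keepends=True)
    new_lines = new_path.read_text().splitlines(keepends=True)
    diff = difflib.unified_diff(
        old_lines, new_lines, fromfile=old_path.name, tofile=new_path.name
    )
    full_bytes = len(new_path.read_bytes())
    diff_bytes = sum(len(line.encode()) for line in diff)
    return full_bytes, diff_bytes

if __name__ == "__main__":
    full, delta = compare_update_sizes(OLD_FILE, NEW_FILE)
    print(f"Shipping the whole file: {full} bytes")
    print(f"Shipping only the changes: {delta} bytes")
```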

Should you have further questions about what this new Windows 10 update process could mean for you and your business, contact us as soon as possible. We can help you with all your operating system needs.

Published with permission from TechAdvisory.org. Source.

Topic windows