November 29, 2010


Wikileaks under DDOS attack on eve of new revelations [Comment]: "

Tomorrow WikiLeaks is set to reveal a new series of classified US documents that were sent by United States embassies around the world. The revelations are set to shed new light on the country’s relations with friendly and unfriendly states alike, and now the site itself has come under a distributed denial of service attack.

Despite reports that hackers are responsible for the attack, there’s no actual evidence one way or the other, but it has raised important and pertinent questions about the information that’s in the public domain and whether there really is such a thing as “too much information”.

Most people will agree that sometimes it’s just better not to know a fact, and to live in ignorance of it. The human brain and conscience are only capable of processing so much, which is why we choose to blank out certain events.

This was the case with the previous WikiLeaks revelations of documents from the Iraq war. Most people chose to ignore the event and, as such, it went away.

Revelations about private discussions and thoughts from US ambassadors and envoys around the world, though, could have a much more serious impact. This time it won’t matter if the public chooses to ignore the documents; other countries, friendly or otherwise, will be poring over them, and you can be certain that there will be ammunition in there for everyone.

Extracts from the revelations, to add fuel to the fire, are also set to be published in tomorrow’s papers, including El Pais in Spain, Le Monde in France, Der Spiegel in Germany, The Guardian in the UK and The New York Times.

The latest round of leaks covers diplomats’ confidential views on countries including Australia, Britain, Canada, Israel, Russia and Turkey and, according to WikiLeaks founder Julian Assange, “covers every major issue in every country in the world”.

Most heads of state will be sensible enough to take such matters on the chin, but it’s the responses of fringe elements in both friendly and unfriendly countries, and the knock-on effects of the revelations for years to come, that should cause the people of the world concern.

WikiLeaks has made its point: that the freedom of the Internet cannot be removed. Now Julian Assange needs to stop, and stop quickly, before his actions, and the actions of his staff, cause an incident that costs even so much as a single life.

It’s all too easy for those of us who sit behind a screen making a living from the Internet to think only of ourselves, cocooned in our own safe little world. It’s harder to think that anything we say or do, or write, can have consequences for other people and perhaps even cause bloodshed… or worse.

This situation comes about because we’ve had the longest period of western peace in history, and every day we see more and more countries working together towards common goals. People like Assange have never witnessed firsthand the horrors of war or suffering. Consequently they have no way to properly relate to it or understand it.

Some of the documents from Iraq exposed corruption and aided democracy. That’s great, and I’m sure some of these documents will do the same. I’m simply saying you need to be careful. The information about the Saudis wanting the US to bomb Iran’s nuclear facilities neither exposes corruption nor aids democracy. It’s just salacious information that can only have the effect of making any such move to disarm Iran less likely.

This is why people like Assange are dangerous. Power without responsibility is always dangerous and it is us, the people of the world, who will have to accept responsibility for this one man’s actions.

I don’t care tonight whether it is hackers or the US government that is trying to bring down this website. I can only hope that they succeed. Our freedoms, and the freedom of the Internet, do not need to come at so high a price as a person’s life.


© Mike Halsey for gHacks Technology News, 2010.

"

November 25, 2010


10 tips for troubleshooting DNS problems: "

Figuring out what’s wrong with DNS will go faster if you have a set of troubleshooting steps to follow. Brien Posey shares his approach to isolating the cause of DNS problems.





DNS is one of the most essential services on any Windows network. Active Directory can’t function without DNS, and it is also used by any number of other network functions. So it’s critical to troubleshoot DNS problems as fast as possible. Thankfully, the process is usually fairly easy. Here are 10 of my favorite DNS troubleshooting techniques.


Note: This article is also available as a PDF download.


1: Verify network connectivity


When DNS problems occur, one of the first things you should do is verify that the DNS server still has network connectivity. After all, if the problem ends up being something as simple as a NIC failure, you can save yourself a lot of time by checking for the problem up front.


The easiest way to verify connectivity is to log on to the DNS server and try to ping a few machines. You should also try to ping the DNS server from a few random machines. Remember that ping will work only if you allow ICMP packets through the firewall on the machine you are pinging.
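For example, a hypothetical transcript (the addresses are placeholders):

    C:\> ping 192.168.1.10

    Pinging 192.168.1.10 with 32 bytes of data:
    Reply from 192.168.1.10: bytes=32 time<1ms TTL=128

A string of “Request timed out” replies here would point to a connectivity or firewall problem rather than to DNS itself.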


2: Determine the scope of the problem


After you have determined that basic connectivity still exists, the next step is to determine the scope of the problem. Are Internet name resolutions failing or are local name resolutions failing too? The answer is going to make a difference in how you will have to troubleshoot the problem. For example, if local name resolution works but Internet name resolution does not, the problem may lie with one of your ISP’s DNS servers.


3: Find out whether all users are affected


Another thing to look at is whether the problem affects all of the users on the network or whether it’s limited to a subset of users. If you determine that only some users are affected, check to see whether all those users are located on a common network segment. If so, the problem could be related to a router failure or a DHCP configuration error.


4: See whether the DNS server is performing load balancing


Organizations hosting high demand Web servers sometimes try to distribute the workload across multiple identical Web servers by using a load balancing technique called DNS Round Robin. The problem with this technique is that the DNS server has no way of knowing when one of the servers has failed. As a result, inbound traffic is still directed to all the servers in round robin fashion, even if one of those servers is offline. The result is intermittent connectivity problems to the load-balanced resource.
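You can often spot round robin by resolving the name and checking whether several addresses come back; a hypothetical example with placeholder addresses:

    C:\> nslookup www.example.com
    Name:      www.example.com
    Addresses:  10.0.0.11, 10.0.0.12, 10.0.0.13

If one of the listed addresses no longer responds while the DNS record still includes it, round robin is a likely source of the intermittent failures.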


5: Check the DNS server’s forwarders


If you determine that local name resolution requests are working but Internet requests are failing, check to see whether your DNS server uses forwarders. Even though many DNS servers use root hints for Internet name resolution, some use forwarders to link to an ISP’s DNS server. And if the ISP’s DNS server goes down, Internet name resolution will cease to function as the entries in the resolver cache expire. If your DNS server does use forwarders, you can try pinging the server to see whether it’s online. You might also have to call the ISP to see whether it’s having any DNS issues and to make sure that the IP address you are using in your forwarder is still valid.
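A quick way to test a forwarder, assuming you know its address (10.0.0.1 below is a placeholder), is to send a query straight to it; nslookup accepts the server to query as a second argument:

    C:\> nslookup www.example.com 10.0.0.1

If queries to the forwarder time out while local lookups succeed, the forwarder (or the path to it) is the likely culprit.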


6: Try pinging a host


If name resolutions are failing on your local network, try pinging some of the servers on your network. Start out by pinging the server’s IP address. This will confirm that connectivity to the server is working. Next, try pinging by computer name and by the server’s fully qualified domain name.


If you can ping the host by IP address but not by name, check your DNS server to make sure that a Host (A) record exists for the host. Without a Host (A) record, the DNS server will be unable to resolve the host’s name.
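Put together, the sequence might look like this (the names and addresses are placeholders):

    C:\> ping 192.168.1.25                     (works: connectivity is fine)
    C:\> ping fileserver1                      (fails: name resolution problem)
    C:\> nslookup fileserver1.domain.local     (check whether a Host (A) record exists)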


7: Use NSLookup


One of the handiest tools for troubleshooting DNS failures is the NSLOOKUP command, which you can access from a Windows Command Prompt window. Simply type NSLOOKUP followed by the name of the host for which you want to test the name resolution. Windows will return the name and IP address of the DNS server that resolved the name (although the DNS server’s name is often listed as Unknown). It will also provide you with the fully qualified domain name and the IP address of the host you specified.
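A typical exchange looks something like this (hypothetical output):

    C:\> nslookup www.example.com
    Server:  UnKnown
    Address:  192.168.1.10

    Non-authoritative answer:
    Name:    www.example.com
    Address:  93.184.216.34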


NSLOOKUP is useful for two things. First, it allows you to verify that name resolution is working. Second, if name resolution isn’t working, it allows you to confirm which DNS server is being used. Keep in mind that NSLOOKUP will list only the DNS server it initially connects to. If the name resolution request is forwarded to other DNS servers, those servers are not listed.


8: Try an alternate DNS server


Most organizations have at least two DNS servers. If your primary DNS server is having problems, try using an alternate. If name resolution begins working after you switch DNS servers, you have confirmed that the problem is indeed related to the DNS server and not to some external factor.
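On a Windows client you can point name resolution at the alternate server from the command line; the interface name and address below are placeholders, and the netsh syntax should be verified against your Windows version:

    C:\> netsh interface ip set dns "Local Area Connection" static 10.0.0.2

Alternatively, aim nslookup at the alternate server directly (as in tip 5) without changing the client configuration at all.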


9: Scan for viruses


About a week ago, someone called me because every time they would try to visit certain Web sites they were redirected to a malicious Web site instead. I initially suspected a DNS poisoning attack, but ruled out such an attack because only one computer was affected.


The problem was that a virus had integrated itself into the TCP/IP stack and was intercepting all name resolution requests. Even though this initially appeared to be a DNS problem, the virus was ultimately to blame.


10: Reboot the DNS server


I know that it sounds like a cliché, but when all else fails, reboot the DNS server. I have seen several situations over the years in which name resolution stopped for an unknown reason but rebooting the DNS server fixed the problem.


Likewise, I have seen at least two examples of consumer-grade routers that have stopped forwarding DNS requests even though other types of traffic continue to flow. In one of these situations, resetting the router fixed the problem. In the other situation, the router had to be replaced. It was thought that the router might have been damaged by a power surge that had occurred a day before the problems started.


What else?


Do you have some other go-to tricks that help you zero in on DNS problems? If so, share them with your fellow TechRepublic members.








"

November 15, 2010


Implement a single client backup strategy: "
For years my consulting office has managed backup routines for numerous commercial clients. It doesn’t matter who installed the backup software (it could have been us, a competitor, the manufacturer, etc.). It doesn’t matter who configured the backup application (it could have been a novice engineer, a CCIE, an MCSE, a 20-year industry veteran, etc.). It doesn’t matter who manufactured the backup software. It doesn’t even matter whether the client is using native backup software, third-party utilities, or disk-imaging tools. Backup operations frequently fail.

Possible causes for backup failures include:

  • Clients forget to replace a tape.
  • A user disconnects an external hard disk power cord in order to charge his cell phone.
  • Software hangs.
  • Media fails.
  • The power goes out.
  • Flash floods strike.

Regardless of the cause, there’s one thing your consulting firm can do to help reduce failures in one of the most important business continuity tasks: standardize, when possible, on a single backup application. Centralizing client backups on a single platform from a single manufacturer has numerous benefits.

Standardization benefits


By implementing a single backup application, your office’s engineers minimize the number and variety of strange ghost-in-the-machine errors that occur. When anomalies arise (such as application A proves incompatible with brand B tape drives, but only on 64-bit servers), once you discover and document the solution (potentially an obscure and ill-publicized patch), you’ll be better prepared to quickly resolve that issue the next time it occurs.

By deploying a single backup solution when possible, engineers become more familiar with the application’s advanced capabilities. When technicians must split their time familiarizing themselves with a variety of backup suites, it’s more difficult to master any one tool’s advanced features.

Favoring a single backup solution also enables engineers to become intimately familiar with an application manufacturer’s technical support processes. The more a support system is used, the better it’s typically navigated and leveraged. Occasionally you get lucky, too, and score the name and direct number or email address of a particularly savvy technical support representative. Trying to land such contact information from a variety of backup application providers is far less likely to succeed.

Don’t forget recovery


The reason we make backups is to recover systems when they fail. The more familiar a consultancy becomes with the recovery processes required of a specific backup application, the better.

My office has become so familiar with one leading brand of image backup software that we’ve recovered failed servers from totally and fatally corrupted states to fully operational condition in less than three hours, including the time required to locate a new chassis. When your energies are as focused as the real world reasonably allows, your abilities with a single platform tend to increase.

There will be exceptions


Occasions will arise when a sole backup application won’t adequately meet all clients’ requirements, and you’ll have to work with other software solutions. But when given a choice, and when a client asks for your best recommendation, you could do much worse than recommending a solution your office techs are intimately familiar with, know inside and out, and can troubleshoot and recover quickly and confidently.




"


Speed Up Your Website By Optimizing Images: "
Page speed is a ranking factor in the Google search engine. According to Google officials it is currently used in 1 out of 1000 queries. I think it was Matt Cutts who said that speed acts as a tie breaker in some situations. It is likely, however, that speed will play a bigger role in the future. But it is not only the search engine marketing and visibility aspect that plays a role here. The majority of visitors like a fast loading website. Depending on the value of the contents, or their need to access them, they may be inclined to wait, or to leave the page if it loads too slowly.
Webmasters have lots of options to reduce the page loading time of their websites. This includes removing unnecessary scripts, using compression, minifying HTML, CSS and JavaScript files, merging files where possible, and optimizing the images that are hosted on the server.
The difference between an optimized image and an unoptimized image can have a big impact on page loading times. Think of it this way: if you can halve the size of each image hosted on your web server without reducing the visible image quality, then you have cut the image loading time in half as well (half is not entirely correct, but let’s use that figure for the sake of the argument).
The two main image formats used on the web are png and jpg. Jpg images are usually well compressed, and there is little to gain by reducing their quality further; the image quality drops significantly at some point.
Png images on the other hand offer lots of room for improvement, if they have been saved as true color png images. Let’s take a look at the following two images for the sake of this argument.
aptitude main true color
aptitude main optimized
Do you see a difference in image quality? The second image’s size is 64 Kilobytes, which is 102 Kilobytes less than the size of the first image.
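In other words, the optimized screenshot weighs 64 Kilobytes against 166 Kilobytes for the true color original, a reduction of roughly 60 percent. On a page with, say, ten comparable screenshots, that is about a Megabyte less to transfer on every uncached page load (the per-page figure is purely illustrative).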

Using Riot to optimize images

You can use lots of different programs to optimize images. They all come with the core capabilities but differ highly in their batch optimization features. Riot is a free portable program that can process images in batch (see Image Resizer And Optimizer Riot).
The program interface looks like this on startup. I have already made the relevant changes in the lower half. In particular, you need to switch to the PNG tab and select Optimal 256 Colors Palette, best compression (slow), NeuQuant neural-net (slow) and PNGout Xtreme (very slow) for the output files.
You then click the Batch icon at the top, which opens an overlay window. Click on Add Images (or the small arrow next to it) to load images directly or by folder. Please note that you should only load png images: it does not help to convert jpg images to png, considering that they are still linked as jpg images on the web.
riot image optimizer
Make sure you select a second folder for the output images. A click on start optimizes all images loaded in the window.
Webmasters can then upload the optimized images to their web server.
Please note that the reduction to 256 colors may not work for all image types. It works well for screenshots and other images that we publish here at Ghacks.
For WordPress based websites, the best option was to process one image folder at a time; WordPress saves image uploads in monthly folders. The whole process per folder was to copy all png images from the folder to the local system, add them to Riot, process them, compare some of the input and output images, and reupload them to the server in the same directory.

© Martin for gHacks Technology News, 2010.

"


Netstat tips and tricks for Windows Server admins: "
Netstat is a command that some Windows Server admins use every day, while others only use it when there is a problem. I fall into the latter category; I use netstat as a diagnosis tool when something has gone awry, or when I am trying to track something down.

The 10 parameters to the Windows netstat command can display scores of additional information for troubleshooting or everyday use. The most common invocation of netstat uses the -a parameter, which displays all connections and listening ports. However, netstat displays useful information even without parameters. Here are some pointers on using the netstat command:

Fully qualified domain name: The -f parameter will display the fully qualified domain name (FQDN) of the foreign address in the netstat display. This will resolve names internally and externally if possible. Figure A shows the FQDN resolution within netstat.

Figure A
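The original screenshot is not reproduced here; a hypothetical excerpt of what the -f output looks like:

    C:\> netstat -f

    Active Connections

      Proto  Local Address        Foreign Address                 State
      TCP    192.168.1.5:49211    server1.corp.example.com:https  ESTABLISHED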



What process is running on the open port: Tracking down which process identifier (PID) has a port open is quite easy when netstat is run with the -a -n -o combination of parameters. Read my Windows Server 2008 tip on this sequence of commands, and see it in action in Figure B.

Figure B
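A hypothetical run, filtering for one port and then matching the PID to a process with the built-in tasklist command (1234 is a placeholder PID):

    C:\> netstat -a -n -o | find "3389"
      TCP    0.0.0.0:3389    0.0.0.0:0    LISTENING    1234

    C:\> tasklist /fi "PID eq 1234"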



You can take this one step further with the implementation of friendly names for each process with the -b netstat parameter. This parameter requires administrative permissions and is shown in Figure C.

Figure C
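For example (hypothetical output; note the owning executable listed in brackets beneath each connection):

    C:\> netstat -b
      TCP    192.168.1.5:49213    server1:https    ESTABLISHED
     [iexplore.exe]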



Note that the remote addresses pointing to the 192.168.1.220:3261 address are the Windows iSCSI initiator service and display differently than the other services listed.

Display routing table: If you need to determine why one system has a different experience than another on the same network, netstat can display a route of the current system with the -r parameter. Figure D shows this in use (note the persistent routes section that would display any static routes added to the Windows Server).

Figure D
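netstat -r produces the same output as the route print command; a trimmed, hypothetical excerpt:

    C:\> netstat -r

    IPv4 Route Table
    Network Destination        Netmask          Gateway       Interface      Metric
              0.0.0.0          0.0.0.0      192.168.1.1     192.168.1.5         20

    Persistent Routes:
      None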



These four netstat commands can greatly add to the troubleshooting efforts for Windows administrators. How else do you use netstat? Share your tips in the discussion.




"


Zoho Support adds a Help Desk to its Web-based office suite: "

Zoho -- the popular Web-based office suite -- has added another component to its customer support offerings. In addition to Zoho Assist, their Java-based remote assistance app, the company is now offering Zoho Support, a full-featured help desk application.



Support packs tons of essential features, from managing customer requests and contracts to maintaining tasks and contacts. There's also a powerful reporting feature so you can analyze your Support data. It's an excellent option for small or medium-sized businesses that need a way to track customer issues and problem resolutions. Access to the core help desk functionality is free for one help desk user and up to 25 incidents per month -- which might not seem like a lot, but should suffice for small-time freelancers. Paid plans start at $12 per month per user.



Want to see Zoho Support in action? Check out the official preview video after the break.
"

November 12, 2010


This Day in Geek History: The Web Was Invented: "
It was exactly 20 years ago today that Tim Berners-Lee published the formal proposal to create the “WorldWideWeb”, probably for the sole purpose of one day allowing people to put moronic captions on pictures of cats.

Image from Wikipedia

The First Browser & Web Server


The first web server was a NeXT Computer, created by Steve Jobs’ company and running a Unix-based operating system named NeXTSTEP. It didn’t have a hard drive by default, came with a 1120×832 grayscale display, and cost only $6500.

This same computer was also used to create the first web browser, appropriately titled WorldWideWeb and later renamed Nexus to avoid confusion. It took Berners-Lee only two months to put together the first working version; after some more development, it was capable of displaying basic style sheets, downloading files, browsing newsgroups, and even spellchecking. Somebody should mention that last one to the Internet Explorer team…



Screenshot from Wikipedia.

It wasn’t until August 6, 1991, that Tim Berners-Lee actually announced that the web was publicly available, on the alt.hypertext newsgroup, but it was 20 years ago today that the web was invented.

World Wide Web [Wikipedia]
"

November 10, 2010


Rescue Disk Creator Sardu 2 Released: "

I reviewed the Shardana Antivirus Rescue Disk Utility (Sardu) back in 2009 and found it to be a helpful, but complex to set up, rescue disk creator. The developer has recently released version 2 of Sardu, which sports new features and an improved user interface.

The portable software program offers a selection of tools that can be included in the bootable disk image or copied to a USB device. The utilities are grouped into four tabs: antivirus, utility, Linux Live and Windows PE.

sardu rescue disk creator

It is possible to select multiple programs from a group. The antivirus tab, for instance, offers to include the AVG Rescue CD, Avira AntiVir Rescue System, Dr. Web Live CD and the Kaspersky Kav Rescue CD, among other choices. A click on a program downloads it to the local system; another click adds it to the disk image so that the rescue disk can be accessed whenever the system is booted from that CD, DVD or USB stick. A few rescue disks and programs need to be downloaded manually; for those, a click on the program button launches a web page with the download option instead.

Sardu displays the total size of the current selection in the interface; there is, however, no indication of the size of individual tools. That would have been handy to make the selection process easier when there is a size limit for the rescue disk image.

The utilities group offers popular applications including Clonezilla, Gparted, Partition Wizard and the Ultimate Boot CD.

Linux Live CDs can be downloaded and integrated in the third group. Popular choices include Ubuntu, Damn Small Linux, Puppy Linux and BackTrack.

The fourth and final category lists options to include Windows PE on the rescue disk. Available are Windows PE 1.x, Windows PE 2 and later, and Windows Recovery Disks.

windows recovery disks

It is possible to download most of the disk images directly and place them in the ISO folder of the application. That can be helpful for users who want to download multiple rescue and system disks at once: Sardu can only download one file at a time, and downloading the images externally can speed things up. All images are linked on the Sardu project website.

Once the downloads and selections have been made, it is time to create the ISO image, if the rescue disk is to be burned to CD or DVD, or to copy the rescue disks to a connected USB device.

It is pretty easy to create a rescue disk with Sardu 2. The only problem that I encountered during the review is that some of the messages are in Italian, even when English is selected as the language. This was not a big problem though, as it was always clear what each notification meant.

sardu

Sardu is available for download at the developer website. (via)


© Martin for gHacks Technology News, 2010.

"

November 9, 2010


Canonical axing X Windows: What will it mean for the next version of Ubuntu?: "

In yet another recent announcement that had the Linux community looking like the proverbial “deer in headlights,” Canonical has announced that a future iteration of Ubuntu might very well drop X Windows in favor of Wayland. This comes on the heels of Mark Shuttleworth’s recent announcement that 11.04 would see Ubuntu leave behind the GNOME Shell in favor of Ubuntu Unity. That was a tiny drop in the bucket compared to this latest consideration.


Think about it: X Windows. How long has X Windows driven the desktop for Linux? Maybe since Linux had a desktop? This is HUGE! The very thought of a Linux without X Windows is staggering, considering that nearly every application would have to be rewritten. Is the next announcement to be that Canonical is no longer going to ship current Linux applications, but is instead going to develop applications on its own?



  • Ubuntu Web browser.

  • Ubuntu Email client.

  • Ubuntu Graphics editor.

  • Ubuntu Office suite.

  • Ubuntu Shell.


The list goes on and on and on.


I could understand Shuttleworth’s previous announcement. He’s obviously making a grab for an interface that will be more in line with the future of the desktop - specifically multi-touch and tablet PCs. That’s all good. I respect that. But Wayland in place of X Windows?


“What is Wayland?” you ask?


Wayland is a display server for Linux desktops that was created by one of Intel’s open source technologists. Wayland’s stated goal is “every frame is perfect, by which I mean that applications will be able to control the rendering enough that we’ll never see tearing, lag, redrawing or flicker”. But if you dig further, you find that Wayland is actually “a protocol for a compositor to talk to its clients as well as a C library implementation of that protocol. The compositor can be a standalone display server running on Linux kernel modesetting and evdev input devices, an X application, or a Wayland client itself. The clients can be traditional applications, X servers (rootless or fullscreen) or other display servers.”


The key word in the above text (taken from the Wayland web site) is compositor. If the Unity/Ubuntu marriage is going to be as much of a success as Shuttleworth needs it to be, it is going to need a strong compositor built into the system. The last thing this marriage needs is to require a third-party compositor. That would blow the lid off Shuttleworth’s plan for ease of support for manufacturers. With a compositor built into the display server, things ease up quite a bit on the technical support front.


Remember, one of the reasons Shuttleworth made the move to Unity was so that PC manufacturers like Dell would have an easier time supporting Linux. With a similar desktop interface across the board, companies wouldn’t have to worry about GNOME, KDE, Enlightenment, etc. Instead there would be Unity and that’s it.



I suspect that Shuttleworth has some grand plans with Wayland that will unfold for the average eye in the months leading up to the release of 11.04.


My initial reaction to this announcement was quite the eye-opening shock. After all, X Windows has been the driving force behind the Linux desktop since Linux had a desktop. But it has been my opinion that Mark Shuttleworth has had the Midas touch with Linux, and you can’t shun his ideas until they are proven wrong (if ever they are).


Canonical has made some seriously bold steps in recent weeks. These bold steps will hopefully take Linux (at least Ubuntu Linux) to heights it has never seen before. I applaud Shuttleworth for these actions. Whether you like them or not, they speak volumes for the dedication that Canonical has for the Linux operating system.





"

November 8, 2010


Soft2Base, Download, Install Multiple Freeware Apps At Once: "

I sometimes take a look at the PCs of friends and family, usually because something is not working and they have asked me to, and to a lesser extent when they have bought a new PC. It is often the case that the PC is not equipped with the right programs: no solid web browser, no text editor, no data transfer program, no security program and so on.

One of the tasks that I do over and over again is to install the programs that I have come to rely on on those computer systems. For that I open each program’s website, download the program and install it afterwards. This takes some time, considering that it is usually not just one application that needs to be made available, but many.

Soft2Base is similar to previously reviewed applications like AllMyApps, Essential Software Installer or Ninite, by offering to download and/or install selected applications at once.

The program displays a set of configuration options on startup. Here it is possible to select the application language, the program interface language and the download / installation options. It is, for instance, possible to configure the program to perform silent installs, install only newer applications, or download the programs without installing them.

soft2base

The next screen displays the supported programs sorted into categories. A pulldown menu on top offers templates, like complete, minimum or custom, that auto-select applications in the listing.

application installer

Many popular applications are available, from iTunes and VideoLan in the multimedia category and Open Office and Notepad in the Desk category, to Firefox, Flash Player, Avast, 7-Zip and CCleaner.

The selection is good, but several applications that I’d personally install are missing. The excellent burning software ImgBurn is missing, as are Google Chrome, Opera and True Crypt, among others.

Still, the available applications can be put to use, and all it takes to download and install them on a system is to select them and proceed by clicking on the Install button. I checked the versions of some applications, and they were always the latest available versions.

An installation log can be displayed at the very end of the process.

Soft2Base Download and Compatibility

Soft2Base is available for download at the developer website. The portable software is compatible with most Windows operating systems, including Windows XP, Windows Vista and Windows 7.


© Martin for gHacks Technology News, 2010.

"


BOFH: You just can't go around killing people: "

Episode 14 What do you mean why? 'Cause you can't



Bot Wars IV - The Screenplay

[Black Screen]

Several screens of multi-coloured static flash by before the words:

INITIAL TESTS INDICATE

UNIT OK

appear in large tasteless block letters on the screen. Another burst of static crowds the screen before a Camera image appears with the top half of the PFY’s upside-down face …

"


Q: What’s your Windows template approach?: "

Once upon a time, I was a Windows Server administrator. Most of my focus was on Windows Server deployment and management. VMware virtualization was a large interest, but my Windows responsibilities dwarfed the amount of time I spent with VMware. One place where these roads intersect is Windows templates. Because a large part of my job was managing the Windows environment, I spent time maintaining “the perfect Windows template”. The following were the ingredients I incorporated:

Applications:

  • Adobe Acrobat Reader
  • Advanced Find & Replace
  • Beyond Compare
  • Diskeeper
  • MS Network Monitor
  • MS Resource Kits
  • NTSEC Tools
  • Latest MS RDP Client
  • Symantec Anti-Virus CE
  • MS UPHClean
  • VMware Tools
  • Windows Admin Pack
  • Windows Support Tools
  • Winzip Pro
  • Sysinternals Suite
  • Windows Command Console
  • BGINFO
  • CMDHERE
  • Windows Perf Advisor
  • MPS Reports
  • GPMC
  • SNMP

Tweaks:

  • Remote Desktop enabled
  • Remote Assistance disabled
  • Pagefile
  • Complete memory dump
  • DIRCMD=O env. variable
  • PATH tweaks
  • taskmgr.exe in startup, run minimized
  • SNMP
  • Desktop prefs.
  • Network icon in System Tray
  • Taskbar prefs.
  • C: 12GB
  • D: 6GB
  • Display Hardware acceleration to Full*

* = if necessary


VMware virtualization is now, and has been for going on two years, my main focus. By title, I’m no longer a Windows Server administrator, and I don’t care to spend a lot of time worrying about what’s in my templates. I don’t have to worry about keeping several applications up to date. In what I do now, it’s actually more important to consistently work with as generic a Windows template as possible. This ensures that projects I’m working with on the virtualization side of things aren’t garfed up by any of the 30+ changes made above. Issues would inevitably appear, and each time I’d need to counterproductively deal with the lists above as possible culprits. As such, I now take a minimalist approach to Windows templates as follows:











Applications:

  • VMware Tools

Tweaks:

  • C: 20GB
  • VMXNET3 vNIC
  • Activate Windows
  • wddm_video driver*
  • Disk Alignment
  • Display Hardware acceleration to Full*

* = if necessary


In large virtualized environments, templates may be found in various repositories due to network segmentation, firewalls, storage placement, etc. As beneficial as templates are, keeping them up to date can become a significant chore, and the time spent doing so eats away at the time savings they provide. Deployment consistency is key in reducing support and incident costs, but making sure templates in distributed locations are consistent is not only a chore, it is of paramount importance. If this is the scenario you’re fighting, automated template and/or storage replication is needed. Another solution might be to get away from templates altogether and adopt a scripted installation, a tried and true approach that provides automation and consistency without the hassle of maintaining templates. The hassle in this case isn’t eliminated completely; it’s shifted into other areas such as maintaining PXE boot services, maintaining PXE images, and maintaining post-build/application installation scripts. I’ve seen large organizations go the scripted route in lieu of templates. One reason could simply be that scripted virtual builds are strategically consistent with the organization’s scripted physical builds. Another could be the burden of maintaining templates, as I discussed earlier. Is this a hint that templates don’t scale in large distributed environments?
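To give a flavor of the scripted approach, here is a minimal hypothetical post-build batch fragment; the share path is an assumption, and the silent-install switches should be verified against each installer’s documentation:

    rem post-build: baseline application plus one tweak from the first list
    rem (\\deploy01\builds is a placeholder deployment share)
    \\deploy01\builds\vmware-tools\setup.exe /S /v"/qn"
    rem enable Remote Desktop, mirroring the "Remote Desktop enabled" tweak
    reg add "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server" /v fDenyTSConnections /t REG_DWORD /d 0 /f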


Do you use templates and if so, what is your approach in comparison to what I’ve written about?


Post from: boche.net - VMware Virtualization Evangelist


Copyright (c) 2010 Jason Boche. The contents of this post may not be reproduced or republished on another web page or web site without prior written permission.


"

November 4, 2010


10 applications you can move to the cloud: "

Providers have made great strides in improving the security and reliability of cloud services — so much so that Justin James sees a number of areas where moving applications to the cloud makes a lot of sense.





Until recently, I was not a big fan of putting mission-critical applications in the cloud or letting someone else provide them. I had been burned too many times by shady vendors or providers who just did not have their acts together. But in the last few years, things have changed. There is a new breed of application vendors out there who have certain application classes nailed down really well and have established reputations for reliability, security, and fairness. It’s a good time to take a look at the cloud again. Here are 10 applications that can be moved to the cloud.


Note: This article is also available as a PDF download.


1: Email


Email is the lifeblood of many organizations, and as a result, many companies are not willing to let go of it. That is understandable. But hosted email providers have moved beyond the days of packing 5,000 mailboxes belonging to 300 accounts onto a cheap computer running with a basic POP3/SMTP setup. While basic email service is still out there, you can get hosted Exchange services from a variety of vendors (if you need it), as well as some upscale, non-Exchange offerings. Email architecture has become quite standardized, and there is really no value-add to keeping it inside your firewall other than mitigating regulatory concerns.


2: Conferencing software


Setting up and maintaining conferencing software is not fun. To make matters worse, when it is down, it needs to be up in a hurry. Like email, there is zero benefit to locating this within your firewall. Also like email, the setup and configuration can be complex enough to require an expert, unless you don’t mind tying up a staff member for a few days. For a low monthly or yearly fee, this weight can be off your shoulders. No one will notice or mind, and your staff can move on to other tasks.


3: CRM


The decision to outsource CRM can be scary. After all, like email, CRM is where so many of the company’s crown jewels are stored. But there are no technical benefits or advantages to having CRM in-house. It’s a fairly low bandwidth application with maintenance overhead you do not need. In addition, the licensing of many CRM systems can be a hassle. Moving to a hosted CRM system can free you to spend more time on more important issues.


4: Web hosting


Hosted Web space used to be as awful as hosted email, unless you were willing to spend big bucks on a dedicated server. Many vendors have shifted to (or offer) a virtualized hosting environment, which has dramatically increased uptime, reduced security risks, and allowed them to provide much more open and direct access to the servers. This is great news, especially for companies with custom applications that require a deployment path beyond copying some files over.


5: Development test labs


Building and maintaining test environments for software developers is a major undertaking. To do it right, you need all sorts of permutations of operating systems, patches, and relevant applications. You could easily find yourself with nearly 100 test beds for a simple Web application, for example. Why do this to yourself when there are quality vendors out there who already have these test systems set up or that allow you to configure them with point-and-click ease? And you can safely give the keys to the development staff and know that they can’t permanently mangle the test systems, too.


6: Video hosting


A few years ago, I was down on the idea of using the common video sites to host your video. Many companies would block those sites under the assumption that they were only for games, there was the real fear of having ads placed on your videos, and often the quality would be compromised. Now, the big name sites have upgraded their quality and few companies block them because there is plenty of legitimate usage. In addition, some sites allow you to pay a fairly low charge to give you more control over your video, like deciding where it can appear and enhancing its quality.


7: Email security


Even if you do not put your email with a hosted vendor, you will want to look at having a third party perform your anti-spam and antivirus duties, even if it’s only as a first-line defense. If you look at how much incoming email is spam, you’ll see that you can reduce your bandwidth needs dramatically by allowing a third party to perform an initial scan of your email. It will also allow you to have far fewer email servers. Speaking from personal experience, even a small company can have its email servers and network overwhelmed by incoming spam. Getting a good spam scanner outside the network can make a night-and-day difference.


8: Common application components


There is always the perpetual “build” vs. “buy” question for development projects, but the cloud adds a new wrinkle. Many functions that used to be the purview of components or libraries you could buy are now being made available as Web services, typically billed on a per-usage basis. Often, these services take a variety of lower-level functions and roll them into a complete, discrete offering. You would be surprised at how many of these Web services are available, and depending upon your usage scenario, it could make a good deal of sense to use them instead of rolling your own.


9: Basic office applications


If you need the full power of the Microsoft Office suite, by all means, this isn’t for you. But if you are one of the many organizations that use only a small fraction of the Office feature set, it may make sense to look at one of the new crop of online Office replacements (or even Microsoft’s online version of Office). I honestly never thought the day would come when this was possible, but it is a legitimate possibility for some companies. Just be honest with yourself before making this move and work closely with your users, since this directly affects so much of their workday.


10: Batch processing applications


One type of application that shines in the cloud is batch processing applications, such as data warehouses. As long as you can get the data needed into the cloud without disrupting your operations (such as “seeding” it with the shipment of physical media or synchronization over time), the ability to rapidly scale capacity in the cloud can result in tremendous savings. For example, if you need 15 servers’ worth of computing capacity for a once-per-week job, do you really want to have 15 servers sitting idle in your server room waiting for that? Or would you rather just modify the task to start 15 cloud instances, run the process, and shut them down again? In a scenario like this, it is clear that cloud computing can deliver significant advantages.
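To put hypothetical numbers on it: fifteen cloud instances at, say, $0.10 per hour for a four-hour weekly job cost 15 × 4 × $0.10 = $6 per week, or roughly $300 a year, against the purchase, power, and cooling costs of fifteen servers that sit idle six days out of seven. Rates vary by provider, but the gap is typically of that order.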








"


Could You Live Without Internet Connectivity?: "

It’s hard to believe that the web as we know it is still a teenager. No other teenager in all of history has had such a massive impact on life. Throughout world history, technology has fostered change in human society, but never at such a rapid pace as today. Like a person aging or gaining weight who fails to notice when they look in the mirror each day, society moves blindly forward, oblivious to the changes unfolding within it. How can we see just how far reaching the effects are? How can we observe just how far things have progressed? Simple: strip away the web, one device at a time, and see how that simple imaginary exercise would change the ways those devices are used. Like a fat man forced to see that his clothes don’t fit, if we can’t use the devices, we see the changes.

What specific web-enabled devices would be impacted or changed if they didn’t have internet connectivity?

While your first impulse might be to say that the phone and computer would be the only things affected by a lack of connectivity, the truth is that a great number of devices rely on connectivity for functionality. Many of the hottest products on the market today would become worthless without the connectivity that makes them so attractive. What are they? The list is giant: home computers, office computers, laptops, netbooks, iPads and other tablets, MP3 players, iPhones, Android phones, smart phones of all kinds, GPS devices, Kindles and other e-readers, mapping programs, classified ads such as craigslist, news sites such as CNN, game devices such as the WII and XBOX, and even new additions such as internet televisions. No connectivity, and we might as well go back to the devices of the past. While this article will not examine all of the connected devices in society, it will look at those which have the most impact on our daily lives.

Let’s start with the most simple. Imagine that your home computer suddenly lost the ability to access the internet. Sure, there are still plenty of interactive programs to use on it, but let’s be honest: the vast majority of time spent on home computers is spent writing or answering email, chatting on Facebook, MSN, or Yahoo, making calls using Skype, or simply watching videos on YouTube and browsing the web.

xkcd internet (via)

Without the internet, your computer once again becomes a word processor, not much more than a glorified typewriter. You can play video games, but not interact, and frankly, the WII is a better venue for that than the PC. Your computer would move back to the dusty corner it occupied in the early 1990s and would be used to write, work on spreadsheets, and maybe listen to music, which would all have to be imported through CDs! No more downloads, no more file sharing, no more researching, and no more viral videos. Not only that, but no more Facebook or email. You would have to sit down and write letters to the people you loved, or call them. In fact, you might even prefer to sit and hand-write a letter instead of typing it if the instant gratification of email were to disappear. No more World of Warcraft, Yahoo answers, Google search, or Wikipedia. If you want to buy the Encyclopaedia Britannica CD-ROMs, though, you might be able to find them in a second-hand store.

How would the loss of connectivity affect how you use your home computer or other devices?

The truth is that our society has changed so much over the last decade because of the advent of connected technology that like the fat man in the mirror, we are sometimes oblivious to it. Young people have never known what it is like to receive a hand written letter or in some cases to buy a book!

Some sources say that the average western person spends more time online than engaged in any other activity!

The amount of time most people use their computers would shrink from a few hours a day to something like a few hours a week. Significant, yes. Suddenly, people would be spending time together (in person) and would have to actually move around to experience a change in environment. Odd.

Next, let’s look at mobile computing devices: laptops, iPads, and netbooks. Let’s face it, without connectivity, most people wouldn’t bother to carry their machines around with them. Sales of all three items would plummet. You would have to actually visit a brick and mortar store in order to load movies, music, or books onto your devices. No more email, gaming, browsing, or chatting. Unless you are a writer or a person who gets a significant amount of offline use out of a machine to begin with, chances are that you wouldn’t bother carrying the laptop with you anywhere. Once again, all of the uses outlined for the home computer apply here, but in a mobile way. Let’s face it, we like laptops and other carry-around computers because they let us connect. If you took away the connectivity, you would find that most users would choose to leave their machines at home or not buy them in the first place.

Moving on to the Kindle: would it make any sense to have a Kindle if you had to go to a brick and mortar shop every time you wanted to buy a new book to load on it? Wouldn’t most people simply buy the books in the old-fashioned print form? The Kindle would be nothing more than a curiosity without connectivity.

In terms of the WII and XBOX, people would still use them, but the interactivity is a significant reason why people upgraded from their old Nintendo 64s.

Now, what about phones? How would usage change if there were no connectivity for your iPhone or Android mobile? You would have to buy new apps in a brick and mortar store. No more web, no more email, no more maps, no more GPS. Would the iPhone have been a success without the connectivity of YouTube, iTunes, and email? Probably not. The phone would once again be a device for talking to people. Sure, you could still play games, listen to music, or use the calculator. The camera would still work, but in today’s world, the number two use of phones is email and internet. This would disappear.

Other devices that would suffer from a lack of connectivity are MP3 players and GPS devices. Without connectivity, they are no longer so incredibly useful: you can’t download music or access maps without a connection. Would we go back to vinyl and cassettes? Probably not, but we would certainly see stores like Tower Records reemerge in retail areas.

To sum up, if we were to lose connectivity, the world as we know it would revert to a sort of 1990s form in which people wouldn’t know the moment their friends changed their relationship status, music, videos, and games would have to be bought in actual shops, and people would have to spend more time with one another - in person. Now that I think about it, it doesn’t sound so bad.


© Brian Welsh for gHacks Technology News, 2010.

"

Free Backup Software – Best Windows Backup Software Programs

Source: http://www.ghacks.net/2009/04/26/the-10-best-windows-backup-software-programs/

Backing up data regularly should be one of the most important tasks of every computer user, yet only a minority does it thoroughly and regularly. The rest are flirting with disaster, as there are numerous incidents that can delete data on computer systems. The most common are hardware failures (damaged hard drives, but also partially unreadable CDs or DVDs), computer virus attacks and human error. If you have ever met someone who partitioned the wrong hard drive, you know that the latter can be cause for great frustration.
Backups are the single most effective method of preventing data loss on computer systems. The following article lists the 10 best Windows backup software programs.
Cobian Backup
cobian backup
Cobian Backup is a free backup program for personal use that supports both local and remote backups. The software is under constant development, which means that new features are added regularly. Some of the key features include full, differential and incremental backups, file compression including the popular 7-zip format, plus strong encryption for data security.
DeltaCopy
windows backup software
An open source backup solution supporting incremental backups, one-click restore options, task scheduling and email notifications. Some advanced options include ssh tunneling and connecting to rsync daemons.
It makes use of a server-client system. One or multiple backup servers can be created on computers running Windows by installing the server version on those systems. The client is installed on any computer system where files should be backed up regularly.
The free backup software supports authentication, scheduling and connection by IP or hostname.
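Since DeltaCopy profiles drive rsync underneath, a backup task boils down to something like this hypothetical command (the module and path names are placeholders):

    rsync -av "/cygdrive/c/Documents/" backupserver::docs/

rsync’s delta algorithm transfers only the changed portions of files, which is what makes the incremental backups efficient.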
Cucku Backup
cucku backup
Cucku Backup calls itself social backup. It provides local backups but also backups on friends’ computers. It supports complete and continuous backups and can automate the remote backup process to take that off the user’s shoulders. All files that are backed up are encrypted to ensure data safety and integrity, so that backup partners can see neither the file names nor the contents of the files that are sent over to their computer systems.
It works best in computer networks but can also work over the Internet if enough time or upload bandwidth is provided.
Backup Maker
The free backup software Backup Maker provides extensive backup capabilities. It comes with a standard and an expert mode to suit the needs of both experienced and inexperienced users. The software supports full and partial backups, local and remote backups, selection of a backup execution interval, and execution on certain events like USB detection or system start and shutdown.
All in all, an excellent software program to back up files on the Windows operating system.
Ace Backup
ace backup
Powerful free backup software that supports multi-versioning, backups to local and remote locations, file compression and encryption.

Microsoft SyncToy

synctoy
SyncToy is offered by Microsoft as a free download for the Windows operating system. It can be used to easily synchronize data between computer systems. The program offers five different synchronization options for the user to choose from, including a preview option before starting the process.
Mozy
mozy
Mozy is an online backup solution that offers backup clients for Windows NT-based systems and Mac OS X. Every registered user receives 2 Gigabytes of free space, with the option to sign up for a paid account (currently $4.95) that offers unlimited backup space. Several pre-configured backup sets are populated after installation, including bookmarks, documents and multimedia files. These can, but do not have to, be backed up. Expert mode provides access to the full file system to pick files or folders to back up directly.
Personal Backup
personal backup
A free backup program for the advanced user that provides great file filtering options. It comes with the usual set of features, including local and remote backup creation (including SFTP), file compression and encryption, status reports and log file generation.
Allways Sync
allways sync
Another program primarily designed for file synchronization that also supports file backups to a local drive, over a local network or over the Internet. It works on a per-directory basis and can be installed on as many computer systems as the user desires. The software comes as a setup or portable version.
Comodo Backup
comodo backup
Another free backup solution for Windows users. Comodo Backup can back up files and folders on a local computer system to other drives, network locations, ftp servers and removable media. Backups can be scheduled, and notifications can be sent to inform about completed backup jobs. Other features include compressed backups, data recovery options, support for multi-session backups and incremental backups.
DriveImage XML
driveimage xml
DriveImage XML is more a drive imaging program than a backup program. It can however be used to back up a full hard drive or partition to another drive. It uses the Volume Shadow Copy Service to create exact backups during runtime. It is afterwards possible to restore the backup either from within Windows or with the use of a boot disk.
The verdict:
The choice of the right backup software depends on several factors, including the data size, the frequency of backups and the local computer infrastructure.


An Apache Mod. By Google For Faster Websites: "
Google is determined to make the web faster with the tools and resources it creates. Remember also that they announced that speed is a factor in Google's search rankings.

Google is now sharing an open source Apache module named mod_pagespeed that automatically optimizes web pages.

mod_pagespeed

It works with Apache 2.2 and includes several filters that optimize JavaScript, HTML, CSS stylesheets, JPEG and PNG images.
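A minimal httpd.conf sketch; the module path is an assumption for a typical Linux layout, and the directive and filter names follow the initial mod_pagespeed documentation, so check the current docs before copying:

    LoadModule pagespeed_module /usr/lib/apache2/modules/mod_pagespeed.so
    ModPagespeed on
    ModPagespeedEnableFilters combine_css,rewrite_images,extend_cache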

For the other members of the Page Speed family like the Page Speed extension, check this.

"

November 3, 2010


Review: Weezo 2.1 remote access application: "

There are so many ways to gain remote access to your computer: from over-simplified free software, to paid solutions, to VPNs and more. Where there is a will to gain access, there is a way to gain access. However, one of the available tools takes a different approach. Instead of gaining full access to your machine, you gain access to your documents by way of a Web-based server. This tool is Weezo, and it (basically) turns your computer into a secure Web server that allows secure access to files, simplified sharing of files, and even quick and easy Web publication of your content and desktop.


System requirements



  • Product: Weezo

  • Operating systems: Windows 2000, XP, 2003, Vista, 7

  • Display capable of 32-bit color

  • Broadband or better network connection


Who’s it for?


Weezo is for anyone who not only needs remote access to their machine (via a Web browser) but also has an interest in sharing the content with other users or even publishing the content onto the Web. Publishing can be done using the free, permanent URL attached to an account (a free account must be created for this) or to popular Web sites. Weezo is not only a possible solution for home users (which it seems to focus on), but also a solid entry in the remote access category for business users. Since data transfers are not handled through a third-party affiliate, data privacy is ensured.


What problem does it solve?


The ability to share files and gain remote access to those files can be a task that is overwhelming to some users. Weezo makes this task surprisingly simple (considering how complex the system really is). The tool is easy to use, from installation onward.


Key features




  • Data privacy

  • Create groups for sharing and limit each group's access

  • No file size restrictions

  • Universal access from any browser from any network connection

  • Share/publish photos, music, files, etc.

  • Permanent URL associated with account

  • RSS reader

  • Basic blogging tool

  • Chat with groups and users associated with your account

  • Gain full access to your remote machine

  • Share files with torrents


What’s wrong?


The biggest problem with Weezo is that most larger enterprises are not going to be terribly happy with their users turning their desktops into Web servers - regardless of whether the integrated Web server is secure or not. The idea of file sharing, and even, in many instances, remote desktop access, isn’t nearly as bad as the idea of having any server running on a corporate desktop. Outside of that major issue, your machine will have to have port 80 traffic routed to it, and this too can be a major issue on any larger business network (especially if that LAN already routes port 80 traffic to another machine).


Competitive products



Bottom line for businesses


If your company is okay with your setting up your desktop machine as a secure Web server so you can gain access to your desktop from remote locations, then I say go for it with Weezo. The features this application adds to the standard remote access tool make it an easy sell for normal users. The only issue will be whether your company is okay with what Weezo does to open up your desktop. In theory it’s secure… but in theory, many things said to be secure actually have vulnerabilities.


User rating


Have you encountered or used Weezo? If so, what do you think? Rate your experience and compare the results to what other TechRepublic members think. Give your own personal review in the TechRepublic Community Forums or let us know if you think we left anything out in our review.




"