Oracle Linux: Unable to connect to the internet

When first installing Oracle Linux, you may find that you are unable to connect to the internet. You may make the mistake of thinking that the problem is with the network settings on the host side, and futz about changing what type of network connection the VM uses. Don’t bother; it isn’t going to do any good, and you will only get annoyed.

The problem lies on the guest (VM) side, and is simple enough to solve; here’s how:

  1. On the VM, go to System > Preferences > Network Connections
  2. Highlight System eth0, and click Edit
  3. Check the check box marked “Connect Automatically”
  4. Click “Apply”

You’ll be prompted for the root password, and then you’re done; you should now be able to connect to the internet.
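For the terminally inclined, the same fix can be made from a shell. On Oracle Linux 6, the “Connect Automatically” checkbox corresponds to ONBOOT=yes in the interface configuration file, normally /etc/sysconfig/network-scripts/ifcfg-eth0 (eth0 is an assumption; adjust to your interface). To keep this sketch runnable anywhere, it operates on a sample copy of the file:

```shell
# The GUI checkbox maps to ONBOOT=yes in the interface config.
# This sketch works on a sample copy so it can run anywhere;
# on a real system, edit the file under /etc/sysconfig/network-scripts/ as root.
CFG=/tmp/ifcfg-eth0
printf 'DEVICE=eth0\nBOOTPROTO=dhcp\nONBOOT=no\n' > "$CFG"   # sample config
sed -i 's/^ONBOOT=no$/ONBOOT=yes/' "$CFG"                    # the actual fix
grep '^ONBOOT=' "$CFG"                                       # prints: ONBOOT=yes
# On the real file, follow up with (as root): service network restart
```

The sed expression is the whole trick; everything else is scaffolding to make the example self-contained.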

The Cloud Storage Price War is upon us

Back in March, Google announced that they were massively slashing storage prices for their customers. Previously $4.99/month for 100 GB and $49.99/month for 1 TB, the prices were cut to $1.99 and $9.99, respectively. Beyond that, they charge $99.99 per 10 TB, meaning that 40 TB will set you back $399.96. In and of itself, that is interesting, but compared to the pricing of competitors, it gets downright impressive.

For example, Dropbox charges $9.99 per month for 100 GB while Apple’s iCloud service runs you $20 per year (or just about $1.67 per month, billed annually) for 10 GB.
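Normalizing everything to price per gigabyte per month makes the gap plain. A quick back-of-the-envelope calculation (treating 1 TB as 1024 GB, and using the roughly $1.67/month that iCloud’s annual billing works out to):

```shell
# Monthly price per GB for each plan mentioned above
awk 'BEGIN {
  printf "Google  100 GB: $%.4f/GB\n", 1.99 / 100
  printf "Google  1 TB:   $%.4f/GB\n", 9.99 / 1024
  printf "Dropbox 100 GB: $%.4f/GB\n", 9.99 / 100
  printf "iCloud  10 GB:  $%.4f/GB\n", 1.67 / 10
}'
```

That is roughly 2¢/GB (or 1¢/GB at the terabyte tier) for Google, against 10¢/GB for Dropbox and nearly 17¢/GB for iCloud.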

Apple and Google have their market dominance funnelling users to them: Apple because iOS users have the option of using iCloud to back up their devices, and Google because Gmail users already have Google Drive attached to their accounts. Dropbox has no such funnelling of users and income. Hence, their lack of movement on the pricing front seems ill-advised to me.

Dropbox are banking on their existing user base staying faithful, while attempting to innovate on whatever front they can. The only place where they have an advantage, and only over one of their competitors, is the iOS app’s automatic photo upload feature, which I would expect to see implemented by Google shortly.

Unless Dropbox find a way to remain relevant, I don’t think they will last long. Time will tell.

Run Control Panel applets with elevated permissions

Sometimes, you want to launch a Control Panel applet with elevated permissions. Normally, you would right-click the program you want and select “Run as administrator”. However, the Control Panel applets don’t give you that option, and so we need to go deeper.

As it turns out, the Control Panel applets are all located in C:\Windows\System32, identified by the .cpl file extension. Simply right-click the one you want, and off you go. Alternatively, you can open an elevated command prompt and run the command control applet.cpl, where applet.cpl is the name of the applet you want to run. What are those names, you ask? Here you go:

   Control panel tool             Command
   -----------------------------------------------------------------
   Add/Remove Programs            control appwiz.cpl
   Date/Time Properties           control timedate.cpl
   Display Properties             control desk.cpl
   Fonts Folder                   control fonts
   Internet Properties            control inetcpl.cpl
   Keyboard Properties            control main.cpl keyboard
   Mouse Properties               control main.cpl
   Multimedia Properties          control mmsys.cpl
   Network Properties             control netcpl.cpl
   Password Properties            control password.cpl
   Printers Folder                control printers
   Regional Settings              control intl.cpl
   Sound Properties               control mmsys.cpl sounds
   System Properties              control sysdm.cpl

Note that this list is in no way exhaustive, but it should do for most applications.

Force uninstall when installer is unavailable

Imagine the scene: you are having a problem with a program, and the manufacturer tells you that the solution is a complete uninstall followed by a reinstall. You go to uninstall, and Windows tells you that it can’t find the installer. Looking around, neither can you. So, now you’re up a certain waterway without a certain rowing implement, aren’t you? Not necessarily.

Luckily, Microsoft has created a tool which automatically finds the registry keys in question, and lets you remove them. The tool is called FixIt, and is intuitive to more or less a fault. Keep in mind that the tool does not support the runas command, and must be run by a user that has local administrative privileges.

Tesla: All our patents are belong to you

In a move that is at once impressive and baffling to many commentators, Tesla Motors CEO Elon Musk recently announced that they are applying the open source philosophy to their portfolio of patents. While a great publicity stunt, there seems to be more to it than that. Musk said:

At Tesla, however, we felt compelled to create patents out of concern that the big car companies would copy our technology and then use their massive manufacturing, sales and marketing power to overwhelm Tesla. We couldn’t have been more wrong. The unfortunate reality is the opposite: electric car programs (or programs for any vehicle that doesn’t burn hydrocarbons) at the major manufacturers are small to non-existent, constituting an average of far less than 1% of their total vehicle sales.

At best, the large automakers are producing electric cars with limited range in limited volume. Some produce no zero emission cars at all.

By open sourcing their patents, they do run the risk of losing market share. That is, I think, a minor concern – if it is one at all – for the company. I think one of two things will happen: either no one uses the patents, in which case we are where we were – status quo ante – or the patents are used, in which case Tesla’s burden in building an ecosystem around their technology is shared by other manufacturers. Either way, by letting anyone use the patents, more smart people can do more smart things with them, improving the electric car market – for everyone.

Oracle Linux: Insufficient memory to auto-enable kdump

Remember how I said “you’re all set” last week? Turns out, I was only partially right. After you create a local user account, the installer also configures kdump, the kernel crash dumping mechanism. When attempting to do so, it returned this error message:

Insufficient memory to auto-enable dump. Use system-config-kdump to configure manually

As problems go, this one is fairly minor; kdump does not need to be enabled or configured for everything else to work. I made a note of the issue, clicked right through the error, and logged in. That said, I really don’t like leaving stuff like that without looking into resolving it. I immediately grokked that system-config-kdump refers to a shell command, opened a terminal, and entered the command. It opened this window:

KDump config window

I proceeded to click “Enable”, and then “Apply”. The system asked me to reboot, and then prompted for the root password three times. After that, it returned the following error:

Starting kdump:[FAILED]

On a hunch, I increased the kdump memory to 160 MB, then tried to apply once more. After being prompted for the root password thrice, the settings were saved. Peculiarly, the kdump memory was set back to its original 128 MB. Following a reboot, I checked, and kdump was up and running, kdump memory still set to 128 MB.
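For verifying that state from a terminal, a couple of checks help. On a real Oracle Linux 6 box you would run the commands in the comments below; to keep this sketch self-contained, it parses a sample kernel command line (the LVM root path is a placeholder) instead of the real /proc/cmdline:

```shell
# On a real OL6 system, check kdump like this:
#   service kdump status      # is the dump service running?
#   chkconfig --list kdump    # is it enabled at boot?
#   cat /proc/cmdline         # how much memory is reserved (crashkernel=)?
# Self-contained version: extract the reservation from a sample cmdline string.
CMDLINE='ro root=/dev/mapper/vg_ol6-lv_root crashkernel=128M rhgb quiet'
RESERVED=$(echo "$CMDLINE" | grep -o 'crashkernel=[^ ]*' | cut -d= -f2)
echo "kdump reservation: $RESERVED"    # prints: kdump reservation: 128M
```

The crashkernel= parameter is the memory the running kernel sets aside for the crash kernel, which is why the GUI’s “kdump memory” setting ultimately lands on the boot command line.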


Starting out with Oracle Linux

I have been wanting to learn more about Linux for some time now, and the time has come to transform that desire into action. I have landed on Oracle Linux, for reasons that will soon become clear. As I don’t have a machine to dedicate to this, and because I want the ability to take snapshots and revert to previous states, I will be running Linux as a virtual machine.

For a while to come, you will see posts like this, all stored in the Linux101 Category, where I chronicle the specific challenges, annoyances and problems I run into, as well as their solutions, when I find them.

I have some experience in running virtual machines, and prefer using VirtualBox for such pursuits, because it is intuitive to set up, and is free to use (that is free as in freedom, AND as in beer). All Oracle software resources are distributed from the Oracle Technology Network, OTN. Using it is free, though I gather some resources are not available (more on this later, when and if I find out more).

In order to install Oracle Linux, you naturally need to download the installation media, distributed as .iso disk images. To install Oracle Linux, you only need a single file. As I am running Oracle Linux 6, Update 5, this file is V41362-01.iso. There are four other iso files available from OTN; at this point, we do not need those.

For future reference, here’s how I set my environment up:

  • Download the installer from OTN
  • In VirtualBox, create a new virtual machine, setting it up as a Linux, Oracle (64 bit) system
  • I gave it 1600 MB RAM, and a 55 GB Dynamically Allocated VDI Harddrive
  • Booting the virtual machine, I installed the OS from the downloaded installer
  • Going through the installer, I kept options at default, setting location, keyboard and root password as appropriate
  • Wanting to have a GUI, I opted for a Desktop install

Having done all that, you need to wait for a while; the installation procedure takes anywhere from three to fifteen minutes. Once the VM has rebooted, you need to create a local user account (in addition to root), and that’s it, you’re all set!
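For future experiments, the setup above can also be scripted with VBoxManage. This is a sketch, not gospel – the VM name, file paths and controller names are my own placeholders, and the disk size is 55 GB expressed in MB – so adjust to taste:

```shell
# Sketch of the same VirtualBox setup from the command line.
# VM name, paths and controller names are placeholders; run on a host
# with VirtualBox installed, from the directory holding the installer iso.
VM="OracleLinux6"
VBoxManage createvm --name "$VM" --ostype Oracle_64 --register
VBoxManage modifyvm "$VM" --memory 1600 --nic1 nat
VBoxManage createhd --filename "$VM.vdi" --size 56320      # 55 GB, dynamic VDI
VBoxManage storagectl "$VM" --name SATA --add sata
VBoxManage storageattach "$VM" --storagectl SATA --port 0 --device 0 \
    --type hdd --medium "$VM.vdi"
VBoxManage storagectl "$VM" --name IDE --add ide
VBoxManage storageattach "$VM" --storagectl IDE --port 0 --device 0 \
    --type dvddrive --medium V41362-01.iso                 # boot the installer
VBoxManage startvm "$VM"
```

From there, the installer proceeds exactly as in the GUI-created machine.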

The importance of CSI

CSI – that’s Continual Service Improvement, by the way, not Crime Scene Investigation – is, to my mind, the single most important stage in the ITIL service life cycle. It evaluates what has gone before, identifies areas for improvement, and aids in implementing those improvements. In an ideal situation, CSI informs all the other stages, and is the main driver for the service life cycle.

The basic premise and assumption is that there is always room for improvement, revision and change. By tightly controlling that process, using metrics, KPIs (Key Performance Indicators) and CSFs (Critical Success Factors), we move ourselves, our business and our customers onward to better reliability and service.

A recent article in the Norwegian press dealt with the retirement from professional shooting of the premier Norwegian skeet shooter, and London 2012 silver medalist, who used the opportunity to direct some pretty scathing criticism at the leadership of the national team, saying that the manager was “the biggest amateur of them all”. In response, the manager of the national shooting team commented:

Following the evaluation of the last season, the most successful in several years, the shooters’ association found no reason to make any changes. That can’t be amateurish.

Now, the premise that the manager lays down is that, because they were unable to find any areas for improvement, they are not amateurs. My contention would be that they a) are amateurs and b) haven’t been looking hard enough. Sure, the current premier shooters may be as good as they will ever get, but that doesn’t mean that nothing can be changed or improved.

The arrogance of saying “we can’t find anything to improve, so we must be professionals” is staggering, and indicates to me that he is, indeed, an amateur. If the foremost athletes in the world, on winning an Olympic gold medal, were to say “well, that’s that, then; I have no room for improvement now”, how would the world of sports look?

No, we must define metrics, measure them and compare them to KPIs and CSFs. That is the way forward, onward and upward. Following the national team manager’s lead means that we can only ever be as good as we are, and will likely become worse.

List user home folder size

Users tend to store all kinds of crud on their network home folders, which can be a constant source of frustration for SysAdmins. Luckily, it is fairly easy to get a list of the size of each folder, using a Powershell script. The script has already been made for us. It is discussed in detail here, and can be downloaded from here. You could do this from your local computer, or by remote desktop to the remote computer. The procedure is the same.

      1. Open File Explorer
      2. Navigate to the root folder for home folders
      3. Copy the script to that folder
      4. Start Powershell using an account that has the necessary permissions (typically an admin account)
      5. In Powershell, Navigate to the root folder for home folders
      6. Run the script, using this command: .\Get-DirStats.ps1 >> c:\users\XXXX\Desktop\List.txt

(Replace XXXX with your user folder)

The output is a list of all user folders, by name. To sort it, simply import it into Excel.

The folder sizes are listed in bytes. To convert them to gigabytes, simply divide them all by 1073741824, i.e. 1024³ (individually, of course).

Who protects you?

Since 2011, the EFF, the Electronic Frontier Foundation, has published an annual report, “Who Has Your Back”, detailing how a number of different companies deal with government data requests. 2014 is no exception, and the report came out a few weeks ago.

New to the report are Adobe, the Internet Archive and Snapchat, to mention a few, and everyone on the list has been re-evaluated since the last report. In general, most companies protect you fairly well, with Apple, Dropbox, Google and a few others receiving full marks.

The report is an interesting read, if you care about these things, and worth a gander.