Search Results

Search found 12367 results on 495 pages for 'disk io'.

Page 271/495 | < Previous Page | 267 268 269 270 271 272 273 274 275 276 277 278  | Next Page >

  • How To View and Write To System Log Files on Ubuntu

    - by Chris Hoffman
    Linux logs a large number of events to disk, where they’re mostly stored in the /var/log directory in plain text. Most log entries go through the system logging daemon, syslogd, and are written to the system log. Ubuntu includes a number of ways of viewing these logs, either graphically or from the command line. You can also write your own log messages to the system log — particularly useful in scripts.
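    A minimal sketch of writing such a log message from code, assuming the local syslog daemon accepts RFC 3164 datagrams on 127.0.0.1:514 (on Ubuntu, rsyslog ships with that UDP input commented out, so it must be enabled first); the "myscript" tag is invented for illustration:

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;
    import java.nio.charset.StandardCharsets;

    public class SyslogWrite {
        public static void main(String[] args) throws Exception {
            // RFC 3164-style line: <priority>tag: message
            // Priority 14 = facility 1 (user-level) * 8 + severity 6 (informational).
            String line = "<14>myscript: hello from Java";
            byte[] payload = line.getBytes(StandardCharsets.UTF_8);

            DatagramSocket socket = new DatagramSocket();
            try {
                socket.send(new DatagramPacket(payload, payload.length,
                        InetAddress.getByName("127.0.0.1"), 514));
            } finally {
                socket.close();
            }
        }
    }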

    Read the article

  • Saving game data to server [on hold]

    - by Eugene Lim
    What's the best method to save the player's data to the server?

    Method to store the game saves. Which of the following should I use?
    - A database (e.g. MySQL), storing the game data as blobs?
    - The server's hard disk, storing the saved game data as binary data files?

    Method to send saved game data to the server. What method should I use?
    - A socket.io WebSocket?
    - A web-based scripting language that receives the game data as binary (for example, a PHP script that handles binary data and saves it to a file)?

    Meta-data. I read that some games store saved-game meta-data in database structures. What kind of meta-data is useful to store?
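    There is no single right answer, but as one concrete sketch of the "web script that receives binary" option, here is a tiny Java endpoint that accepts a save file as a raw POST body and writes it to disk. It uses the JDK's built-in com.sun.net.httpserver; the /save route, the saves/ directory, and the player query parameter are invented for illustration, and a real endpoint would also authenticate the player and sanitize the id:

    import com.sun.net.httpserver.HttpExchange;
    import com.sun.net.httpserver.HttpHandler;
    import com.sun.net.httpserver.HttpServer;
    import java.io.IOException;
    import java.net.InetSocketAddress;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;

    public class SaveServer {
        public static void main(String[] args) throws IOException {
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            server.createContext("/save", new HttpHandler() {
                public void handle(HttpExchange exchange) throws IOException {
                    // Hypothetical convention: the player id arrives as ?player=...
                    // No validation here; a real server must sanitize this value.
                    String query = exchange.getRequestURI().getQuery();
                    String player = (query != null && query.startsWith("player="))
                            ? query.substring("player=".length()) : "anonymous";

                    // Stream the raw binary body straight into a per-player file.
                    Path dir = Paths.get("saves");
                    Files.createDirectories(dir);
                    Files.copy(exchange.getRequestBody(), dir.resolve(player + ".sav"),
                            StandardCopyOption.REPLACE_EXISTING);

                    exchange.sendResponseHeaders(204, -1); // 204 No Content, empty body
                    exchange.close();
                }
            });
            server.start(); // the server keeps running on a background thread
        }
    }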

    Read the article

  • Root filesystem check fails after power failure during installation

    - by Oo Nwoye
    During the "install" phase of the upgrade there was a power failure. When starting up again, the following errors are reported:

    init: udevtrigger main process (420) terminated with status 1
    init: udevtrigger post-stop process (428) terminated with status 1
    init: udevmonitor main process (419) killed by TERM signal
    The disk drive for / is not ready yet or not present
    Continue to wait; or press S to skip mounting or M for manual recovery

    Pressing M gives me the following message:

    Root filesystem check failed.
    A maintenance shell will now be started.
    CONTROL-D will terminate this shell and reboot the system.

    Read the article

  • Ubuntu Lagging even LXDE freezes

    - by Anas Ismail Khan
    Laptop, i3, RAM: 2GB, running 14.04 LTS... and it lags like hell. If I open more than 4 tabs in Chrome it freezes; multi-tasking is difficult and at times impossible, and often I have no choice but to restart. Now, there's this whole thing about Lubuntu and LXDE being supposed to be super-fast, so I installed LXDE (mind, not lubuntu-desktop, just LXDE). And it too freezes every now and then, and trust me, when it freezes, it does so worse than Unity, ESPECIALLY when I start PCManFM and mount a disk or two. Any ideas as to why this is happening? The minimum requirement for Unity is supposed to be 1GB of RAM, and people are running it fine even on 512MB.

    Read the article

  • A glitch after Ubuntu Installation. Cannot boot Ubuntu

    - by Starx
    I am trying to create a dual boot of Ubuntu 10.10 with Windows 7. My hard disk allocation was as follows:

    Windows 7   NTFS         100 GB
    /boot       EXT4         200 MB
    swap        Linux swap   4 GB
    /           EXT4         46 GB

    After the installation completed, instead of getting the Ubuntu boot screen, the machine boots straight into Windows 7 without asking anything. What is wrong? I ran the live CD again from a USB drive and I can see that /boot and / are occupied, most likely with Ubuntu data. Now how do I point my laptop at the Ubuntu boot loader instead of the Windows one?

    Read the article

  • The clock hands of the buffer cache

    - by Tony Davis
    Over a leisurely beer at our local pub, the Waggon and Horses, Phil Factor was holding forth on the esoteric, but strangely poetic, language of SQL Server internals, riddled as it is with 'sleeping threads', 'stolen pages', and 'memory sweeps'. Generally, I remain immune to any twinge of interest in the bowels of SQL Server, reasoning that there are certain things that I don't and shouldn't need to know about SQL Server in order to use it successfully. Suddenly, however, my attention was grabbed by his mention of the 'clock hands of the buffer cache'. Back at the office, I succumbed to a moment of weakness and opened up Google. He wasn't lying.

    SQL Server maintains various memory buffers, or caches. For example, the plan cache stores recently-used execution plans. The data cache in the buffer pool stores frequently-used pages, ensuring that they may be read from memory rather than via expensive physical disk reads. These memory stores are classic LRU (Least Recently Used) buffers, meaning that, for example, the least recently used pages in the data cache become candidates for eviction (after first writing the page to disk if it has changed since being read into the cache). SQL Server clearly needs some mechanism to track which pages are candidates for being cleared out of a given cache when it is getting too large, and it is this mechanism that is somewhat more labyrinthine than I previously imagined. Each page that is loaded into the cache has a counter, a miniature "wristwatch", which records how recently it was last used. This wristwatch gets reset to "present time" each time the page is used; then, as the page 'ages', it ticks down towards zero, at which point the page can be removed from the cache.

    But what if SQL Server is suffering memory pressure and urgently needs to free up more space than is represented by zero-counter pages (or plans, etc.)? This is where our 'clock hands' come in. Each cache has associated with it a "memory clock". Like most conventional clocks, it has two hands: one "external" clock hand, and one "internal". Slava Oks is very particular in stressing that these names have "nothing to do with the equivalent types of memory pressure". He's right, but the names do, in that peculiar Microsoft tradition, seem designed to confuse. The hands do relate to memory pressure; the cache "eviction policy" is determined by both global and local memory pressures on SQL Server. The "external" clock hand responds to global memory pressure, in other words pressure on SQL Server to reduce the size of its memory caches as a whole. Global memory pressure – which, just to confuse things further, seems sometimes to be referred to as physical memory pressure – can be either external (from the OS) or internal (from the process itself, e.g. due to limited virtual address space). The internal clock hand responds to local memory pressure, in other words the need to reduce the size of a single, specific cache. So, for example, if a particular cache, such as the plan cache, reaches a defined "pressure limit", the internal clock hand will start to turn and a memory sweep will be performed on that cache in order to remove plans from the memory store.

    During each sweep of the hands, the usage counter on the cache entry is reduced in value, effectively moving its "last used" time further into the past (in effect, setting back the wristwatch on the page a couple of hours) and increasing the likelihood that it can be aged out of the cache.
    There is even a special Dynamic Management View, sys.dm_os_memory_cache_clock_hands, which allows you to interrogate the passage of the clock hands. Frequently turning hands equate to excessive memory pressure, which will lead to performance problems. Two hours later, I emerged from this rather frightening journey into the heart of SQL Server memory management, fascinated but still unsure if I'd learned anything that I'd put to any practical use. However, I certainly began to agree that there is something almost Tolkienian in the language of the deep recesses of SQL Server. Cheers, Tony.
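    For the curious, a sketch of interrogating that DMV from Java over JDBC; the connection string and credentials are placeholders, it assumes the Microsoft JDBC driver is on the classpath, and the column list follows the documented shape of sys.dm_os_memory_cache_clock_hands (worth verifying against your SQL Server version):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class ClockHands {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details; point this at a real instance.
            String url = "jdbc:sqlserver://localhost;databaseName=master;"
                    + "user=<user>;password=<password>";

            Connection conn = DriverManager.getConnection(url);
            try {
                Statement stmt = conn.createStatement();
                // A hand whose rounds_count climbs quickly is sweeping often,
                // i.e. that cache is under sustained memory pressure.
                ResultSet rs = stmt.executeQuery(
                        "SELECT name, clock_hand, clock_status, rounds_count "
                        + "FROM sys.dm_os_memory_cache_clock_hands "
                        + "ORDER BY rounds_count DESC");
                while (rs.next()) {
                    System.out.printf("%-40s %-15s %-15s %d%n",
                            rs.getString("name"), rs.getString("clock_hand"),
                            rs.getString("clock_status"), rs.getLong("rounds_count"));
                }
            } finally {
                conn.close();
            }
        }
    }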

    Read the article

  • ubuntu 11.10 install 4.4.3-0ubuntu2 package dependencies

    - by HuangheWoo
    Before running sudo apt-get install gnuplot, I ran sudo apt-get build-dep gnuplot to resolve the package dependencies:

    ~$ sudo apt-get build-dep gnuplot
    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    Note, selecting 'liblua5.1-0-dev' instead of 'liblua5.1-dev'
    The following packages will be REMOVED:
      libgd2-xpm ubuntu-desktop
    The following NEW packages will be installed:
      debhelper diffstat html2text intltool-debian libbsd-dev libcairo-script-interpreter2
      libcairo2-dev libedit-dev libexpat1-dev libfontconfig1-dev libfreetype6-dev libgd2-noxpm
      libgd2-noxpm-dev libglib2.0-dev libjpeg62-dev liblua5.1-0-dev libncurses5-dev
      libpango1.0-dev libpixman-1-dev libpng12-dev libreadline-dev libreadline6-dev libtinfo-dev
      libwxbase2.8-dev libwxgtk2.8-dev libxcb-render0-dev libxcb-shm0-dev libxft-dev
      libxrender-dev po-debconf quilt texinfo wx2.8-headers x11proto-render-dev
    0 upgraded, 34 newly installed, 2 to remove and 0 not upgraded.
    Need to get 9,100 kB of archives.
    After this operation, 37.8 MB of additional disk space will be used.

    It says "ubuntu-desktop" will be removed, but "ubuntu-desktop" is important. What should I do?

    Read the article

  • Apple II Teardown and Restoration Offers a Peek at Computing History [Video]

    - by Jason Fitzpatrick
    In this extended teardown video, we’re granted a peek at the guts of an Apple IIe and treated to quite a bit of Apple IIe history in the process. Todd Harrison, via his project blog ToddFun, shares videos of his Apple IIe restoration project. The videos are lengthy, but include close up examination of all the parts and lots of information about the history of the computer and its construction. You can check out the rest of his Apple II videos and posts at the link below. Apple II Plus from 1982 teardown, repair, cleanup and demonstration [via The Unofficial Apple Weblog]

    Read the article

  • How do I list installed software with the installed size?

    - by Lewis Goddard
    I would like a list of the installed software on my machine, with the disk space each package consumes alongside it. I would prefer to be able to order by largest/smallest, but that is not a necessity. I am the sort of person who will install software to try it, and never clean up after myself. As a result, my 7GB Ubuntu 11.04 partition (Windows and my data are on separate partitions, as well as a swap area) is suffering, and has started regularly showing warning messages. I have cleaned my browser cache, as well as everything under Package Cleaner in Ubuntu Tweak, and am left with 149.81 MB of free space.
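    On Ubuntu the sizes live in dpkg's database, so one way to build such a list is to shell out to dpkg-query and sort the output; a sketch under the assumption that dpkg-query is on the PATH (Installed-Size is reported in KiB, and packages without that field are skipped):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.Comparator;
    import java.util.List;

    public class PackageSizes {
        public static void main(String[] args) throws Exception {
            // Ask dpkg for "size<TAB>name" lines; Installed-Size is in KiB.
            Process p = new ProcessBuilder(
                    "dpkg-query", "-W", "-f=${Installed-Size}\t${Package}\n").start();

            List<String[]> rows = new ArrayList<String[]>();
            BufferedReader out = new BufferedReader(
                    new InputStreamReader(p.getInputStream()));
            String line;
            while ((line = out.readLine()) != null) {
                String[] parts = line.split("\t");
                if (parts.length == 2 && !parts[0].isEmpty()) {
                    rows.add(parts);
                }
            }
            p.waitFor();

            // Largest packages first.
            Collections.sort(rows, new Comparator<String[]>() {
                public int compare(String[] a, String[] b) {
                    return Long.compare(Long.parseLong(b[0]), Long.parseLong(a[0]));
                }
            });
            for (String[] row : rows) {
                System.out.printf("%10s KiB  %s%n", row[0], row[1]);
            }
        }
    }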

    Read the article

  • Have I fixed my partition problem with os x 10.5.8? Are my GPT and MBR back to normal?

    - by David Schaap
    I'm new to Linux and I have overstepped my abilities. I tried dual booting OS X 10.5.8 with Ubuntu 11.10 via rEFIt, but I've been having problems with partitioning. Instead of enduring more headaches, I've decided to simply use Ubuntu in VirtualBox. I've tried to return my HDD to normal, but I am looking for confirmation that my partitions are OK. Here is the report from Partition Inspector:

    *** Report for internal hard disk ***

    Current GPT partition table:
    #  Start LBA  End LBA    Type
    1  409640     233917359  Mac OS X HFS+

    Current MBR partition table:
    #  A  Start LBA  End LBA    Type
    1     1          234441647  ee  EFI Protective

    MBR contents:
    Boot Code: GRUB

    Partition at LBA 409640:
    Boot Code: None
    File System: HFS Extended (HFS+)
    Listed in GPT as partition 1, type Mac OS X HFS+

    Also, my HDD's root directory has a bunch of extra folders in it, and they appear to be Ubuntu related, although Ubuntu is no longer installed: folders like bin, sbin, cores, var, user, and so on. Those folders aren't supposed to be there, right? Thanks in advance.

    Read the article

  • Soda Cans Exploding Under the Stress of High Voltage [Video]

    - by Jason Fitzpatrick
    In an effort to start your Monday off in true Mad Scientist style, we bring you soda cans being decimated by thousands of volts in a “Thumper”. What is a thumper, you ask? During office hours, it’s a high-voltage testing unit most often used to stress test electric cables. In the off hours, however, the electrical engineering geeks over at The Geek Group like to shove anywhere from a few hundred to thousands of volts through unsuspecting objects to see what happens. In this installment they’re shooting high voltage through a variety of soft drink cans with an end result that sounds and looks like a cannon loaded with Mountain Dew. [via Hacked Gadgets]

    Read the article

  • How to Create a Portable Version of RocketDock for a USB Flash Drive

    - by Lori Kaufman
    RocketDock is a lightweight, highly customizable application launcher, or dock, for Windows. You can install it on your computer or use a portable version on a USB flash drive to provide quick access to your portable programs. We’ll show you how to make RocketDock portable. However, first you must install RocketDock before making it portable. See our article about installing, setting up, and using RocketDock. Once you have installed RocketDock, right-click anywhere on the dock or on the icons on the dock and select Dock Settings from the popup menu.

    Read the article

  • Import images from camera in KDE with particular directory structure

    - by Sergey
    I have been using f-spot for a few years to manage my photo archive, which is about 50K images at the moment. With the development of f-spot slowing down in recent years and me switching to KDE, I'm looking at using DigiKam, which seems to be very nice and packed with features beyond my wildest hopes :) One thing I'm missing, though, is the way f-spot imported the images: it created subdirectories based on the image's shooting date:

    $HOME/Photos/2011/11/12/IMG_1234.jpg
    $HOME/Photos/2011/11/13/IMG_1235.jpg
    $HOME/Photos/2011/11/13/IMG_1236.jpg

    I don't seem to be able to find a way to make DigiKam behave like this; although it has settings to change the image filename according to a mask which may include the shooting date, I see no way to tell it to create sub-directories. Is there a way to make DigiKam behave like this? Or, alternatively, what is a good program to import images from a camera and save them on disk in subdirectories according to their shooting date?
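    Not an answer for DigiKam's own importer, but as a stopgap, here is a sketch of a small sorter that reproduces f-spot's year/month/day layout; it uses the file's last-modified time as a stand-in for the EXIF shooting date (reading real EXIF data needs an external library), and the camera and archive paths are placeholders:

    import java.io.IOException;
    import java.nio.file.DirectoryStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;
    import java.text.SimpleDateFormat;
    import java.util.Date;

    public class PhotoImport {
        public static void main(String[] args) throws IOException {
            // Placeholder paths: camera mount point and archive root.
            Path source = Paths.get("/media/camera/DCIM");
            Path archive = Paths.get(System.getProperty("user.home"), "Photos");

            // yyyy/MM/dd mirrors the f-spot layout shown above.
            SimpleDateFormat layout = new SimpleDateFormat("yyyy/MM/dd");

            DirectoryStream<Path> photos =
                    Files.newDirectoryStream(source, "*.{jpg,JPG,jpeg,JPEG}");
            try {
                for (Path photo : photos) {
                    // Modification time approximates the shooting date for files
                    // fresh off a camera; EXIF would be the robust choice.
                    Date taken = new Date(Files.getLastModifiedTime(photo).toMillis());
                    Path dir = archive.resolve(layout.format(taken));
                    Files.createDirectories(dir);
                    Files.move(photo, dir.resolve(photo.getFileName()),
                            StandardCopyOption.REPLACE_EXISTING);
                }
            } finally {
                photos.close();
            }
        }
    }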

    Read the article

  • Just updated, after reboot my computer won't start up again

    - by Alex
    I have a MacBook that I use on occasion which dual boots Ubuntu and OS X (it has rEFIt installed). I turned it on for the first time in a while and it needed a bunch of updates, so I let it run and restarted it when it asked. When it was booting up, it got stuck at a light blue screen. There was nothing on the screen to indicate that it was doing anything; I figured it had just got stuck or something, so I turned it off and back on. (I suspect now it was actually working, but I had no indication that it hadn't just frozen.) Now I can't access either OS X or my Ubuntu partition. When I choose Ubuntu in the rEFIt menu, it shows "No bootable device -- insert boot disk and press any key". If I try to start up OS X it looks like it starts loading, but instead of an Apple logo there's a crossed-out circle icon.

    Read the article

  • Ubuntu suddenly won't boot on a Mac

    - by emchristiansen
    I installed 11.10 in dual-boot mode following the instructions here: https://help.ubuntu.com/community/MactelSupportTeam/AppleIntelInstallation Everything worked fine until I recently updated from Mac OS 10.6.x to the latest 10.6.x (a minor update prompted by OS X). The update made the rEFIt screen disappear, so I ran Boot Repair and reinstalled rEFIt, and everything worked. Then I accidentally left my computer without power while booted into Ubuntu, until it presumably died or hibernated itself. I have been unable to boot into Ubuntu since. I didn't see the GRUB screen when I selected Linux in the rEFIt chooser. Then I reinstalled rEFIt and the Linux option disappeared from the rEFIt chooser. This is a link to the boot info collected by Boot Repair: http://paste.ubuntu.com/755543/ Any help would be appreciated, especially an explanation of what all these components are (EFI, MBR, GPT, GRUB), where they live on disk, how the system knows to find each component, and how they relate to each other. Thanks!

    Read the article

  • VirtualBox host: Ubuntu vs. Windows XP

    - by iambriansreed
    In order to lengthen the lifespan of my machine I am replacing the weakest link, the hard drive, and installing a new OS. I had planned on using XP Pro as my VirtualBox host and Ubuntu as the guest. After playing with Ubuntu desktop and server I am really impressed and am thinking of reversing the VirtualBox setup: Ubuntu host, XP guest. I would use XP for Adobe Fireworks, Netflix, and iTunes (maybe), and that's pretty much it. Any reason not to go with an Ubuntu host and XP guest? I know the XP VM will run slower as a guest, but really, how much slower? It's a desktop: 4GB RAM, 500GB disk, Pentium D 3.2 GHz.

    Read the article

  • What is the best solution for document archiving?

    - by Anders Wallenquist
    I'm looking for a utility that helps me (and my colleagues) archive documents in a systematic manner (like Zeitgeist, but permanent). The requirements:

    - The utility has to clean out old documents from desktops and store them on a server, as automatically and consistently as possible, perhaps from just a few locations (the Documents directory).
    - Documents shall be stored on cheap, large media for many years to come; a hard disk and file system, maybe?
    - Easy to maintain and manage for a small organization.
    - Documents have to be easy to find and restore.

    One systematic manner could be a directory structure by year, month, user, or by user, year, month. It's a plus if documents can be linked to a project, if documents are searchable, and if "documents" can also mean mail and IM discussions, not only traditional OpenOffice documents. Any ideas?

    Read the article

  • RESIZE casper-rw

    - by Oldrifle
    Using TopoResize 0.7.1; the flash drive is a Transcend. I want to add all unused space to casper-rw. Disk = 3.72GB, 2.68GB used, ~1GB free; casper-rw = 1.95GB, casper = 681MB. I boot Ubuntu 11.10 64-bit and see that the size of HOME is about half of casper-rw. It works fine from the flash drive, but I was not able to boot 11.10 64-bit using a USB 3.0 HDD. Ubuntu 11.10 32-bit on a USB 2.0 HDD works OK (currently multiboot with openSUSE). Fuduntu 2012 on a USB 3.0 HDD is VERY FAST. A fast USB 3.0 8GB unit is ready and partitioned; the second partition is labeled casper-rw. I will try to install using the partitions as base and home, with no swap since I have 8GB of RAM. Any suggestions? Thanks

    Read the article

  • Will an Atheros AR928X work with WPA2?

    - by Tommy
    Basically I need only the answer to the question above. Please bear in mind that I am new to Linux. For further explanation, here is the full story: I have the following problem. My friend's notebook (Vista) got a trojan and refuses to work anymore. The Avira Rescue CD did not help either, so I tried an old (9.1) Ubuntu CD and backed up all the essential files. Since we have no Windows install disk we want to put Ubuntu on that notebook. But with the 9.1 version there is no WLAN. The system test tells me that it finds an Atheros AR928X, but ifconfig does not show it and the network manager tells me there are no LAN/WLAN devices. So: does that work more easily with the new Ubuntu version, or is that network adapter a known troublemaker? And: if I get the adapter to work, will it work with the WPA2 network around here?

    Read the article

  • How to Encrypt Your Home Folder After Installing Ubuntu

    - by Chris Hoffman
    Ubuntu offers to encrypt your home folder during installation. If you decline the encryption and change your mind later, you don’t have to reinstall Ubuntu. You can activate the encryption with a few terminal commands. Ubuntu uses eCryptfs for encryption. When you log in, your home directory is automatically decrypted with your password. While there is a performance penalty to encryption, it can keep private data confidential, particularly on laptops that may be stolen.

    Read the article

  • Given an XML which contains a representation of a graph, how do I apply the DFS algorithm to it? [on hold]

    - by winston smith
    Given the following XML, which represents a directed graph:

    <?xml version="1.0" encoding="iso-8859-1" ?>
    <!DOCTYPE graph PUBLIC "-//FC//DTD red//EN" "../dtd/graph.dtd">
    <graph direct="1">
        <vertex label="V0"/>
        <vertex label="V1"/>
        <vertex label="V2"/>
        <vertex label="V3"/>
        <vertex label="V4"/>
        <vertex label="V5"/>
        <edge source="V0" target="V1" weight="1"/>
        <edge source="V0" target="V4" weight="1"/>
        <edge source="V5" target="V2" weight="1"/>
        <edge source="V5" target="V4" weight="1"/>
        <edge source="V1" target="V2" weight="1"/>
        <edge source="V1" target="V3" weight="1"/>
        <edge source="V1" target="V4" weight="1"/>
        <edge source="V2" target="V3" weight="1"/>
    </graph>

    With these classes I parsed the graph and gave it an adjacency-list representation:

    import java.io.IOException;
    import java.util.HashSet;
    import java.util.LinkedList;
    import java.util.Collection;
    import java.util.Iterator;
    import java.util.logging.Level;
    import java.util.logging.Logger;
    import practica3.util.Disc;

    public class ParsingXML {

        public static void main(String[] args) {
            try {
                // TODO code application logic here
                Collection<Vertex> sources = new HashSet<Vertex>();
                LinkedList<String> lines = Disc.readFile("xml/directed.xml");
                for (String lin : lines) {
                    int i = Disc.find(lin, "source=\"");
                    String data = "";
                    if (i > 0 && i < lin.length()) {
                        while (lin.charAt(i + 1) != '"') {
                            data += lin.charAt(i + 1);
                            i++;
                        }
                        Vertex v = new Vertex();
                        v.setName(data);
                        v.setAdy(new HashSet<Vertex>());
                        sources.add(v);
                    }
                }
                Iterator it = sources.iterator();
                while (it.hasNext()) {
                    Vertex ver = (Vertex) it.next();
                    Collection<Vertex> adyacencias = ver.getAdy();
                    LinkedList<String> ls = Disc.readFile("xml/graphs.xml");
                    for (String lin : ls) {
                        int i = Disc.find(lin, "target=\"");
                        String data = "";
                        if (lin.contains("source=\"" + ver.getName())) {
                            Vertex v = new Vertex();
                            if (i > 0 && i < lin.length()) {
                                while (lin.charAt(i + 1) != '"') {
                                    data += lin.charAt(i + 1);
                                    i++;
                                }
                                v.setName(data);
                            }
                            i = Disc.find(lin, "weight=\"");
                            data = "";
                            if (i > 0 && i < lin.length()) {
                                while (lin.charAt(i + 1) != '"') {
                                    data += lin.charAt(i + 1);
                                    i++;
                                }
                                v.setWeight(Integer.parseInt(data));
                            }
                            if (v.getName() != null) {
                                adyacencias.add(v);
                            }
                        }
                    }
                }
                for (Vertex vert : sources) {
                    System.out.println(vert);
                    System.out.println("adyacencias: " + vert.getAdy());
                }
            } catch (IOException ex) {
                Logger.getLogger(ParsingXML.class.getName()).log(Level.SEVERE, null, ex);
            }
        }
    }

    This is another class:

    import java.util.Collection;
    import java.util.Objects;

    public class Vertex {

        private String name;
        private int weight;
        private Collection ady;

        public Collection getAdy() {
            return ady;
        }

        public void setAdy(Collection adyacencias) {
            this.ady = adyacencias;
        }

        public String getName() {
            return name;
        }

        public void setName(String nombre) {
            this.name = nombre;
        }

        public int getWeight() {
            return weight;
        }

        public void setWeight(int weight) {
            this.weight = weight;
        }

        @Override
        public int hashCode() {
            int hash = 7;
            hash = 43 * hash + Objects.hashCode(this.name);
            hash = 43 * hash + this.weight;
            return hash;
        }

        @Override
        public boolean equals(Object obj) {
            if (obj == null) {
                return false;
            }
            if (getClass() != obj.getClass()) {
                return false;
            }
            final Vertex other = (Vertex) obj;
            if (!Objects.equals(this.name, other.name)) {
                return false;
            }
            if (this.weight != other.weight) {
                return false;
            }
            return true;
        }

        @Override
        public String toString() {
            return "Vertice{" + "name=" + name + ", weight=" + weight + '}';
        }
    }

    And finally:

    /** @author user */
    /* -*-jde-*- */
    /* <Disc.java> Contains the main argument */
    import java.io.*;
    import java.util.LinkedList;

    /**
     * Reading and writing of files as lists of strings.
     * Ideal for use with the graph classes.
     *
     * @author Peralta Santa Anna Victor Miguel
     * @since July 2011
     */
    public class Disc {

        /**
         * Method for reading a file.
         *
         * @param fileName the file to be read
         * @return the file represented as a list of strings
         */
        public static LinkedList<String> readFile(String fileName) throws IOException {
            BufferedReader file = new BufferedReader(new FileReader(fileName));
            LinkedList<String> textlist = new LinkedList<String>();
            while (file.ready()) {
                textlist.add(file.readLine().trim());
            }
            file.close();
            /*
            for (String linea : textlist) {
                if (linea.contains("source")) {
                    // String generado = linea.replaceAll("<\\w+\\s+\"", "");
                    // System.out.println(generado);
                }
            }
            */
            return textlist;
        } // readFile

        public static int find(String linea, String palabra) {
            int i, j;
            boolean found = false;
            for (i = 0, j = 0; i < linea.length(); i++) {
                if (linea.charAt(i) == palabra.charAt(j)) {
                    j++;
                    if (j == palabra.length()) {
                        found = true;
                        return i;
                    }
                } else {
                    continue;
                }
            }
            if (!found) {
                i = -1;
            }
            return i;
        }

        /**
         * Method for writing a file.
         *
         * @param fileName the file to be written
         * @param tofile the list of strings that will end up in the file
         * @param append flag saying whether to append the content or start from scratch
         */
        public static void writeFile(String fileName, LinkedList<String> tofile, boolean append) throws IOException {
            FileWriter file = new FileWriter(fileName, append);
            for (int i = 0; i < tofile.size(); i++) {
                file.write(tofile.get(i) + "\n");
            }
            file.close();
        } // writeFile

        /**
         * Method for writing a file.
         *
         * @param msg the file to be written
         * @param tofile the string that will end up in the file
         * @param append flag saying whether to append the content or start from scratch
         */
        public static void writeFile(String msg, String tofile, boolean append) throws IOException {
            FileWriter file = new FileWriter(msg, append);
            file.write(tofile);
            file.close();
        } // writeFile
    } // Disc

    I'm stuck on the best way, given an adjacency-list representation of the graph, to apply the depth-first search algorithm to it. Any idea of how to approach completing the task?
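    A minimal sketch of a depth-first search over the structure that ParsingXML builds, assuming its sources collection is passed in. Note that the Vertex objects stored in an adjacency set are copies whose own ady is null, so the sketch first indexes the "real" vertices by name; that indexing step is my assumption about the parsed data, not part of the original classes:

    import java.util.ArrayDeque;
    import java.util.Collection;
    import java.util.Deque;
    import java.util.HashMap;
    import java.util.HashSet;
    import java.util.Map;
    import java.util.Set;

    public class DepthFirst {

        // Iterative DFS starting from the vertex named startName.
        public static void dfs(Collection<Vertex> sources, String startName) {
            // Map names back to the vertices that carry adjacency sets.
            Map<String, Vertex> byName = new HashMap<String, Vertex>();
            for (Vertex v : sources) {
                byName.put(v.getName(), v);
            }

            // Track visits by name: adjacency entries carry edge weights,
            // so Vertex.equals() would treat them as different vertices.
            Set<String> visited = new HashSet<String>();
            Deque<Vertex> stack = new ArrayDeque<Vertex>();
            Vertex start = byName.get(startName);
            if (start == null) {
                return; // unknown start vertex
            }
            stack.push(start);

            while (!stack.isEmpty()) {
                Vertex current = stack.pop();
                if (!visited.add(current.getName())) {
                    continue; // already visited
                }
                System.out.println("visiting " + current.getName());

                Collection<?> ady = current.getAdy();
                if (ady == null) {
                    continue; // a pure sink such as V3: it never appears as a source
                }
                for (Object o : ady) {
                    Vertex neighbour = (Vertex) o;
                    // Swap the adjacency copy for the indexed vertex, if any.
                    Vertex real = byName.get(neighbour.getName());
                    stack.push(real != null ? real : neighbour);
                }
            }
        }
    }

    Calling dfs(sources, "V0") right after the parsing loop in ParsingXML's main would then visit V0 and each vertex reachable from it exactly once.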

    Read the article

  • Post Deploy MAAS cleanup

    - by David Buttrick
    I have a mostly working MAAS cluster. I'm still learning juju, but while I'm doing that, I wanted to take this opportunity to do some clean-up tasks. Here are my goals:

    - Configure ntp on the nodes.
    - Set the video mode on the nodes.
    - Set the timezone on the nodes.

    Are these juju tasks? Or is this better attacked by mounting the disk image on the MAAS host, and doing the configuration there? If I do it that way, how do I get the nodes to recognize that they have to re-install the image to pick up my changes? Thank you. David

    Read the article

  • Creating a Training Lab on Windows Azure

    - by Michael Stephenson
    Originally posted on: http://geekswithblogs.net/michaelstephenson/archive/2013/06/17/153149.aspx

    This week we are preparing for a training course that Alan Smith will be running for the support teams at one of my customers around Windows Azure. In order to facilitate the training lab we have a few prerequisites we need to handle. One of the biggest ones is that although the support team all have MSDN accounts, the local desktops they work on are not ideal for running most of the labs, as we want to give them some additional developer background training around Azure. Some recent Azure announcements really help us in this area:

    - MSDN software can now be used on an Azure VM
    - You don't pay for Azure VMs when they are no longer used

    Since the support team only have limited experience of Windows Azure and the organisation also has an Enterprise Agreement, we decided it would be best value for money to spin up a training lab in a subscription on the EA; we can then turn the machines off when we are done. At the same time we would be able to spin them back up when the users need to do some additional lab work once the training course is completed. In order to achieve this I wanted to create a PowerShell script which would set up my training lab. The aim was to create 18 VMs based on a prebuilt template with Visual Studio and the Azure development tools. The script I used is described below.

    The Start & Variables

    The below text will set up the PowerShell environment and some variables which I will use elsewhere in the script. It will also import the Azure PowerShell cmdlets. You can see below that I will need to download my publisher settings file and know some details from my Azure account. At this point I will assume you have a basic understanding of Azure and PowerShell, so already know how to do this.

    Set-ExecutionPolicy Unrestricted
    cls
    $startTime = get-date
    Import-Module "C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell\Azure\Azure.psd1"

    # Azure Publisher Settings
    $azurePublisherSettings = '<Your settings file>.publishsettings'

    # Subscription Details
    $subscriptionName = "<Your subscription name>"
    $defaultStorageAccount = "<Your default storage account>"

    # Affinity Group Details
    $affinityGroup = '<Your affinity group>'
    $dataCenter = 'West Europe' # From Get-AzureLocation

    # VM Details
    $baseVMName = 'TRN'
    $adminUserName = '<Your admin username>'
    $password = '<Your admin password>'
    $size = 'Medium'
    $vmTemplate = '<The name of your VM template image>'
    $rdpFilePath = '<File path to save RDP files to>'
    $machineSettingsPath = '<File path to save machine info to>'

    Functions

    In the next section of the script I have some functions which are used to perform certain actions. The first is called CreateVM. This will do the following actions:

    - If the VM already exists it will be deleted
    - Create the cloud service
    - Create the VM from the template I have created
    - Add an endpoint so we can RDP to them all over the same port
    - Download the RDP file so there is a shortcut the trainees can easily access the machine via
    - Write settings for the machine to a log file

    function CreateVM($machineNo) {
        # Specify a name for the new VM
        $machineName = "$baseVMName-$machineNo"
        Write-Host "Creating VM: $machineName"

        # Get the Azure VM Image
        $myImage = Get-AzureVMImage $vmTemplate

        # If the VM already exists delete and re-create it
        $existingVm = Get-AzureVM -Name $machineName -ServiceName $serviceName
        if ($existingVm -ne $null) {
            Write-Host "VM already exists so deleting it"
            Remove-AzureVM -Name $machineName -ServiceName $serviceName
        }

        "Creating Service"
        $serviceName = "bupa-azure-train-$machineName"
        Remove-AzureService -Force -ServiceName $serviceName
        New-AzureService -Location $dataCenter -ServiceName $serviceName

        Write-Host "Creating VM: $machineName"
        New-AzureQuickVM -Windows -name $machineName -ServiceName $serviceName -ImageName $myImage.ImageName -InstanceSize $size -AdminUsername $adminUserName -Password $password

        Write-Host "Updating the RDP endpoint for $machineName"
        Get-AzureVM -name $machineName -ServiceName $serviceName `
            | Add-AzureEndpoint -Name RDP -Protocol TCP -LocalPort 3389 -PublicPort 550 `
            | Update-AzureVM

        Write-Host "Get the RDP File for machine $machineName"
        $machineRDPFilePath = "$rdpFilePath\$machineName.rdp"
        Get-AzureRemoteDesktopFile -name $machineName -ServiceName $serviceName -LocalPath "$machineRDPFilePath"

        WriteMachineSettings "$machineName" "$serviceName"
    }

    The DeleteMachineSettings function is used to delete the log file before we start re-running the process.

    function DeleteMachineSettings() {
        Write-Host "Deleting the machine settings output file"
        [System.IO.File]::Delete("$machineSettingsPath");
    }

    The WriteMachineSettings function will get the VM and then record its details to the log file. The importance of the log file is that I can easily provide the information for all of the VMs to our infrastructure team to be able to configure access to all of the VMs.

    function WriteMachineSettings([string]$vmName, [string]$vmServiceName) {
        Write-Host "Writing to the machine settings output file"

        $vm = Get-AzureVM -name $vmName -ServiceName $vmServiceName
        $vmEndpoint = Get-AzureEndpoint -VM $vm -Name RDP

        $sb = new-object System.Text.StringBuilder
        $sb.Append("Service Name: ");
        $sb.Append($vm.ServiceName);
        $sb.Append(", ");
        $sb.Append("VM: ");
        $sb.Append($vm.Name);
        $sb.Append(", ");
        $sb.Append("RDP Public Port: ");
        $sb.Append($vmEndpoint.Port);
        $sb.Append(", ");
        $sb.Append("Public DNS: ");
        $sb.Append($vmEndpoint.Vip);
        $sb.AppendLine("");
        [System.IO.File]::AppendAllText($machineSettingsPath, $sb.ToString());
    }
    # end functions

    Rest of Script

    In the rest of the script it is really just the bit that orchestrates the actions we want to happen. It will load the publisher settings, select the Azure subscription and then loop around the CreateVM function and create 16 VMs.

    Import-AzurePublishSettingsFile $azurePublisherSettings
    Set-AzureSubscription -SubscriptionName $subscriptionName -CurrentStorageAccount $defaultStorageAccount
    Select-AzureSubscription -SubscriptionName $subscriptionName

    DeleteMachineSettings

    "Starting creating Bupa International Azure Training Lab"
    $numberOfVMs = 16

    for ($index=1; $index -le $numberOfVMs; $index++) {
        $vmNo = "$index"
        CreateVM($vmNo);
    }

    "Finished creating Bupa International Azure Training Lab"
    # Give it a Minute
    Start-Sleep -s 60

    $endTime = get-date
    "Script run time " + ($endTime - $startTime)

    Conclusion

    As you can see there is nothing too fancy about this script, but in our case of creating a small isolated training lab which is not connected to our corporate network we can easily use it to provision the lab. I'm sure if this is of use to anyone you can easily modify it to do other things with the lab environment too. A couple of points to note are that there are some soft limits in Azure on the number of cores and services your subscription can use; you may need to contact the Azure support team to be able to increase this limit. In terms of the real business value of this approach, it was not possible to use the existing desktops to do the training on, and getting some internal virtual machines would have been relatively expensive and time consuming for our ops team to do. With the Azure option we are able to spin these machines up for a temporary period during the training course and then throw them away when we are done. We expect the cost of this test lab to be very small, especially considering we have EA pricing. As a ballpark, I think my 18-VM training lab environment will cost in the region of $80 per day on our EA. This is a fraction of the cost of the creation of a single VM on premise.

    Read the article

  • SQL Server 2012 AlwaysOn Groups and FCIs Part 4

    This is Part 4 of a series on AlwaysOn and FCI integration in SQL Server. In this article we will learn how to add the iSCSI disk storage to our SQL Server nodes and build the cluster.

    Read the article

  • Install Ubuntu and erase Windows Vista

    - by miguel
    I have an older laptop with an ATA hard disk. I can't really buy a new one, so I want to erase Windows Vista on my computer and only have Ubuntu, so that I can have more space. My Windows Vista is messed up and I can't even get into it. I want to download the new version of Ubuntu while in Ubuntu. How do I make the download go directly to my blank CD? I downloaded it, but it didn't go directly to the blank CD, and when I tried to copy all of Ubuntu onto the CD once it was downloaded, it said there was an error while copying. What should I do?

    Read the article
