Search Results

Search found 14041 results on 562 pages for 'home surveillance'.

Page 161 of 562

  • Upload to PPA succeeded but packages don't appear

    - by lorin
    I'm trying to upload packages to my PPA for the first time. I want to use the PPA for customized versions of the OpenStack Compute (nova) project, so as a test I uploaded packages corresponding to the bexar release of this project (lp:nova/bexar), with a new version number and changelog entry. I signed the source packages using my OpenPGP key, which has been uploaded to the Ubuntu keyserver:

        $ dch -v 2011.1-0ubuntu2-isi1 -D lucid "ISI bexar build #1"
        $ dpkg-buildpackage -s -rfakeroot -tc -D -k4C8A14AB

    When I tried to upload the files to the repository, it seemed to work (real email obscured):

        $ dput ppa:lorinh/ppa nova_2011.2~bzr663-1isi1_source.changes
        Checking signature on .changes
        gpg: Signature made Fri 11 Feb 2011 03:52:50 PM EST using RSA key ID 4C8A14AB
        gpg: Good signature from "Lorin Hochstein <lorin@...>"
        Good signature on /home/lorin/packaging/nova_2011.2~bzr663-1isi1_source.changes.
        Checking signature on .dsc
        gpg: Signature made Fri 11 Feb 2011 03:52:44 PM EST using RSA key ID 4C8A14AB
        gpg: Good signature from "Lorin Hochstein <lorin@...>"
        Good signature on /home/lorin/packaging/nova_2011.2~bzr663-1isi1.dsc.
        Uploading to ppa (via ftp to ppa.launchpad.net):
        Uploading nova_2011.2~bzr663-1isi1.dsc: done.
        Uploading nova_2011.2~bzr663-1isi1.tar.gz: done.
        Uploading nova_2011.2~bzr663-1isi1_source.changes: done.

    However, the packages aren't listed on my PPA page. If I try to upload again, I get the error:

        $ dput ppa:lorinh/ppa nova_2011.2~bzr663-1isi1_source.changes
        Package has already been uploaded to ppa on ppa.launchpad.net
        Nothing more to do for nova_2011.2~bzr663-1isi1_source.changes

    Am I supposed to do something next? How do I track down what went wrong? As of this writing, it's been a day and a half since I did the upload.
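    One hedged hint, based only on the behaviour described above: when Launchpad rejects a signed upload it does so silently apart from an email to the uploader, so the rejection notice may be sitting in the inbox of the address in the changelog. Beyond that, a cheap local check is that the key which signed the upload matches the OpenPGP key registered on the Launchpad account (the key ID comes from the post; everything else here is generic):

        # compare this fingerprint with the OpenPGP key listed on your Launchpad profile
        $ gpg --fingerprint 4C8A14AB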

  • OpenSSH SFTP server with chroot()

    - by HannesFostie
    I am currently setting up an SFTP server, but there is one detail I can't seem to figure out. When I add a user, I would like him to be able to connect with his client and write in his "root dir" right away. My Match block for the SFTP-users group currently has ChrootDirectory set to "/home/%u", and inside that directory I have to have a subdirectory owned by the user, while /home/%u itself is owned by root. On top of that, the "root dir" also contains a couple of files, .bashrc to name one. Is it possible to put these files somewhere else, remove them, or at least make them invisible to the user? Thanks
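    A minimal sketch of one common setup, assuming OpenSSH 5.x or later and a group named sftpusers (the group name and paths are illustrative): with ForceCommand internal-sftp no login shell is started, so dotfiles like .bashrc are never read and can simply be deleted, while the chroot-must-be-root-owned rule stays satisfied by giving the user a writable subdirectory.

        # append a Match block to sshd_config (back it up first), then restart sshd
        $ sudo tee -a /etc/ssh/sshd_config <<'EOF'
        Match Group sftpusers
            ChrootDirectory /home/%u
            ForceCommand internal-sftp
        EOF
        $ sudo mkdir -p /home/someuser/files && sudo chown someuser: /home/someuser/files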

  • Migrating a Windows Server to Ubuntu Server to provide Samba, AFP and Roaming Profiles

    - by Dan
    I'm replacing our old Windows XP Pro office server with an HP MicroServer running Ubuntu Server 12.04 LTS. I'm not a Linux expert, but I can find my way around a terminal prompt; I'm a Mac user by choice. The office uses a mix of Windows XP Pro machines and OS X Lion laptops. I included Samba during installation, and I'm planning on using Netatalk for the AFP and Bonjour sharing. I'd quite like to have Samba make the server appear in 'My Network Places' on the Windows machines the way Bonjour makes it appear in Finder on the Macs, if this is possible.

    I want to get to the point where a user logging into Windows gets connected to the Ubuntu server (do they need an Ubuntu user account?), which gives them their shares and their Windows user profile (though a standard profile across users would do). The upshot is to get centralised control of user accounts (e.g. if a person leaves, killing their account on the server stops their Windows logon and their ability to access Samba shares) and to ensure files aren't stored on the individual machines, for backup/security purposes. I want to make this as simple as possible, so I don't want loads of stuff I don't need. I just can't figure out the following (see the sketch after this list):

    What I need at the server end:
    - will Samba be enough (already installed as part of the initial installation), or will I need to mess around with LDAP (and how does that interact with Samba)?
    - for someone of moderate Linux competence like me, is there a package that offers easy admin of user accounts, e.g. a GUI like phpLDAPadmin (if LDAP is necessary)?

    How to configure the XP machines:
    - do I need to have the XP machines joined to a domain (I've no idea, really)?
    - roaming profiles look like they offer putting the user's files on the server rather than the machine itself, along with a profile that follows the user from machine to machine.

    Syncing Mac users' home folders with the server: this is less of a concern because I can set up Time Machine if it comes to it, but I'd appreciate any recommendations on what approach to take to having the Mac home folders synced to the server.
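    One way to read the Windows half of this question is classic NT4-style domain logons, which Samba 3 can do without LDAP (its default tdbsam password backend is enough for a small office). A minimal, untested sketch, with the workgroup name, share path and masks all placeholder assumptions:

        $ sudo tee -a /etc/samba/smb.conf <<'EOF'
        [global]
           workgroup = OFFICE
           security = user
           domain logons = yes
           logon path = \\%N\profiles\%U

        [profiles]
           path = /srv/samba/profiles
           read only = no
           create mask = 0600
           directory mask = 0700
        EOF
        # each Windows user then needs both a Unix account and a Samba password:
        $ sudo smbpasswd -a someuser

    Joining the XP boxes to the domain (machine trust accounts) is the other half of the job; the point of the sketch is only that plain Samba, not LDAP, can cover centralised logons at this scale.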

  • Apache2 Doesn't Serve Subdomain Alias

    - by Cyle Hunter
    I'm trying to put a sub-domain in front of an existing Rails application; essentially I want the sub-domain to serve the same application. Right now Apache2 serves my application at "www.example.com" or "example.com". I adjusted my sites-available virtual host in the hope of allowing "foo.example.com" or "www.foo.example.com", but both are met with a domain-not-found error. Here is my current VirtualHost in /etc/apache2/sites-available/example.com:

        <VirtualHost *:80>
          ServerName example.com
          ServerAlias foo.example.com *.example.com www.foo.example.com www.example.com
          DocumentRoot /home/user/my_app/public
          <Directory /home/user/my_app/public>
            AllowOverride all
            Options -MultiViews
          </Directory>
        </VirtualHost>

    Any ideas? Note: I realise I probably don't need a wildcard sub-domain for what I'm trying to do; I simply added that as a last-ditch effort.

    Edit: The actual domain is virtualrobotgames.com, with the desired subdomain being roboteer.virtualrobotgames.com
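    A browser-side "domain not found" error happens before Apache is ever consulted, which points at DNS rather than the vhost. A cheap first check (hostname taken from the edit above):

        $ dig +short roboteer.virtualrobotgames.com
        # no output here means the subdomain has no A/CNAME record yet,
        # and no amount of ServerAlias tweaking can fix that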

  • Best way to restore individual folders via Time Machine after clean Lion install?

    - by A4J
    I'm doing a clean erase and install of Lion, and am looking for the best way to restore individual folders into my home directory via Time Machine. I've done a dummy run: clean Lion install, then 'Browse Other Disks' in Time Machine, navigate to my home folder and 'restore' what I need, such as pictures/music and folders inside the ~/Library folder (such as Mail and Keychains). However, this method seems to give you odd permissions, like this: http://i43.tinypic.com/15y82v4.png

    Hence I wondered if anyone knows the best method to restore files and folders after a clean install. N.B. I do not want to use Migration Assistant or 'restore OS from Time Machine', as I specifically want to do a clean install and just copy over what I need (some folders will be moved onto a separate disk from the OS, and some will remain on the same disk). Thanks in advance.
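    If the restored copies come back owned by the wrong user (as the screenshot suggests), one blunt after-the-fact fix is to retake ownership folder by folder. A sketch, with the folder path as a placeholder:

        $ sudo chown -R "$USER":staff ~/Pictures
        # staff is the default group for user accounts on OS X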

  • rsync problems and security concerns

    - by MB.
    Hi, I am attempting to use rsync to copy files between two Linux servers, both on 10.04.4. I have set up SSH and a script running under a cron job. This is the message I get back from the cron job:

        To: mark@ubuntu
        Subject: Cron ~/rsync.sh
        Content-Type: text/plain; charset=ANSI_X3.4-1968
        Message-Id: <20120708183802.E0D54FC2C0@ubuntu
        Date: Sun, 8 Jul 2012 14:38:01 -0400 (EDT)

        rsync: link_stat "/home/mark/#342#200#223rsh=ssh" failed: No such file or directory (2)
        rsync: opendir "/Library/WebServer/Documents/.cache" failed: Permission denied (13)
        rsync: recv_generator: mkdir "/Library/Library" failed: Permission denied (13)
        * Skipping any contents from this failed directory *
        rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1060) [sender=3.0.7]

    Q1. Can anyone tell me why I get this message: rsync: link_stat "/home/mark/#342#200#223rsh=ssh" failed: No such file or directory (2)

    The script is:

        #!/bin/bash
        SOURCEPATH='/Library'
        DESTPATH='/Library'
        DESTHOST='192.168.1.15'
        DESTUSER='mark'
        LOGFILE='rsync.log'
        echo $'\n\n' >> $LOGFILE
        rsync -av –rsh=ssh $SOURCEPATH $DESTUSER@$DESTHOST:$DESTPATH 2>&1 >> $LOGFILE
        echo “Completed at: `/bin/date`” >> $LOGFILE

    Q2. I know I have several problems with the permissions: all of the files I am copying usually require me to use sudo to manipulate them. My question then is, is there a way I can run this job without giving my user root access or using root in the login? Thanks for the help.
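    A reading of Q1 grounded in the error text itself: #342#200#223 is rsync's octal escape for the bytes E2 80 93, a Unicode en dash, so the script's –rsh=ssh was typed or pasted with an en dash instead of two ASCII hyphens, and rsync treated the whole word as a filename. A sketch of the corrected line:

        rsync -av --rsh=ssh "$SOURCEPATH" "$DESTUSER@$DESTHOST:$DESTPATH" >> "$LOGFILE" 2>&1
        # note also that 2>&1 must come after the >> redirect if stderr
        # is meant to land in the log rather than in the cron mail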

  • TypeError: Cannot call method 'hasOwnProperty' of null, while creating a QMLscene window

    - by tomoqv
    I am trying to make a simple Ubuntu Touch web application with Qt Creator. I have set up a new project according to the tutorial and committed the files to Bazaar. I have set a URL instead of the default index.htm in the project's qml file. Build-and-run loads a QML Scene window with the desired webpage, but Qt Creator yields the following output:

        Starting /usr/lib/i386-linux-gnu/qt5/bin/qmlscene -I /home/tomas/ubuntu-sdk/SL-planner -I /usr/bin -I /usr/lib/i386-linux-gnu/qt5/qml /home/tomas/ubuntu-sdk/SL-planner/SL-planner.qml
        unity::action::ActionManager::ActionManager(QObject*): Could not determine application identifier. HUD will not work properly.
        Provide your application identifier in $APP_ID environment variable.
        file:///usr/lib/i386-linux-gnu/qt5/qml/Ubuntu/Components/MainView.qml:257: TypeError: Cannot call method 'hasOwnProperty' of null

    My SL-planner.qml looks like this:

        import QtQuick 2.0
        import Ubuntu.Components 0.1
        import QtWebKit 3.0

        /*!
            \brief MainView with a Flickable WebView.
        */
        MainView {
            // objectName for functional testing purposes (autopilot-qt5)
            objectName: "mainView"

            // Note! applicationName needs to match the "name" field of the click manifest
            applicationName: "com.ubuntu.developer.tomoqv.SL-planner"

            /* This property enables the application to change orientation
               when the device is rotated. The default is false. */
            automaticOrientation: true

            width: units.gu(100)
            height: units.gu(75)

            Flickable {
                id: webViewFlickable
                anchors.fill: parent

                WebView {
                    id: webView
                    anchors.fill: parent
                    url: "http://mobil.sl.se"
                }
            }
        }

    What am I missing?
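    Not a definitive answer, but the first warning in the log suggests an experiment: the HUD/ActionManager wants the application identifier in the environment, and the TypeError follows immediately after it. A hedged way to test whether the two are related, reusing the id and paths from the output above:

        $ APP_ID=com.ubuntu.developer.tomoqv.SL-planner \
          /usr/lib/i386-linux-gnu/qt5/bin/qmlscene ~/ubuntu-sdk/SL-planner/SL-planner.qml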

  • maildir in Windows for IMAP

    - by User1
    I'm interested in accessing my IMAP accounts offline. I found that maildirs are a simple way to make that work, and that offlineimap takes care of almost everything in making the IMAP-maildir sync happen. Then I can open the account in the Mutt or Wanderlust client. One major problem: maildirs use colons in their filenames, and Windows doesn't allow colons. I tried

        mount -f -s -b -o managed "d:/tmp/mail" "/home/of/mail"

    in Cygwin, but doing an

        echo test > /home/of/mail/test:file

    didn't work. I'm thinking about ext2fs, but I need an ext2 partition somewhere. Can I make a file into a partition somehow? I don't want to start modifying my hard drive's partition table. Besides, does anyone know if ext2fs will support colons in filenames?
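    On the "file into a partition" part: on a Linux host this is just a loop-mounted filesystem image; whether a Windows ext2 driver can mount such a file-backed image is a separate question, so treat this purely as an illustration of the concept:

        $ dd if=/dev/zero of=mail.img bs=1M count=512   # a 512 MB empty file
        $ mke2fs -F mail.img                            # format it as ext2
        $ sudo mount -o loop mail.img /mnt/mail         # mount it like a partition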

  • chown: changing ownership of `.': Invalid argument

    - by Pierre
    I'm trying to install some new files on our new server while our sysadmin is on holiday. Here is my df:

        # df -h
        Filesystem             Size  Used Avail Use% Mounted on
        /dev/sdb3              273G   11G  248G   5% /
        tmpfs                   48G  260K   48G   1% /dev/shm
        /dev/sdb1              485M  187M  273M  41% /boot
        xxx.xx.xxx.xxx:/commun  63T  2.2T   61T   4% /commun

    As root, I can create a new directory and run chown under /home/lindenb:

        # cd /home/lindenb/
        # mkdir X
        # chown lindenb X

    but I cannot run the same command under /commun:

        # cd /commun/data/users/lindenb/
        # mkdir X
        # chown lindenb X
        chown: changing ownership of `X': Invalid argument

    Why? How can I fix this?

    Updated. mount:

        /dev/sdb3 on / type ext4 (rw)
        proc on /proc type proc (rw)
        sysfs on /sys type sysfs (rw)
        devpts on /dev/pts type devpts (rw,gid=5,mode=620)
        tmpfs on /dev/shm type tmpfs (rw)
        /dev/sdb1 on /boot type ext4 (rw)
        none on /proc/sys/fs/binfmt_misc type binfmt_misc (rw)
        sunrpc on /var/lib/nfs/rpc_pipefs type rpc_pipefs (rw)
        xxx.xx.xxx.xxx:/commun on /commun type nfs (rw,noatime,noac,hard,intr,vers=4,addr=xxx.xx.xxx.xxx,clientaddr=xxx.xx.xxx.xxx)

    Version:

        $ cat /etc/redhat-release
        CentOS release 6.3 (Final)
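    A hedged observation from the mount output above: /commun is NFSv4 (vers=4), and NFSv4 maps owners by name through an idmap domain rather than by raw UID, so a chown to a name the server cannot map back is a common way to get EINVAL. A cheap first check is whether the idmap domain matches on client and server:

        # run on both machines; the Domain values must agree for name mapping to work
        $ grep -i 'Domain' /etc/idmapd.conf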

  • Disable the Old Adobe Flash Plugin in Google Chrome

    - by The Geek
    If you've just updated to the Dev or Beta release of Google Chrome, you might have noticed that a special version of Adobe Flash is now integrated into the default distribution of Chrome. But what about your old plug-in? As it turns out, the old plug-in is generally still installed... but you can easily disable Chrome plug-ins in the latest version, so let's get to work.

    Disable the Extra Flash Plug-in: head over to about:plugins and look through the list. You should notice two Shockwave Flash plugins. The first one should be in your Google Chrome installation folder, with the filename gcswf32.dll. This is the NEW one, so don't disable it! If you keep scrolling down, you'll see the old one, with the file name NPSWF32.dll. This is the OLD plugin, and you can safely disable it.

    Of course, if you only use Chrome you could just completely uninstall Adobe Flash from your system by heading into Control Panel's Uninstall Programs screen, then finding and uninstalling Adobe Flash Player Plugin (the ActiveX version is for Internet Explorer). We've not done any testing to see whether the old Flash plugin is even still active, but you may as well disable it just to be sure, right?

  • VHOST not working in Apache

    - by Starx
    I got a LAMP server working on Ubuntu 11.04. The problem is that sites have to be enabled and disabled from the terminal, and all of them have to be accessed via http://localhost, which is not very efficient. So I created a vhost, using some tutorials off the net. Here is the config for it:

        <VirtualHost *:80>
          ServerAdmin webmaster@localhost
          ServerName site.com
          ServerAlias www.site.com
          DocumentRoot /home/starx/public_html/site/public
          <Directory />
            Options FollowSymLinks
            AllowOverride None
          </Directory>
          <Directory /home/starx/public_html/site/public>
            Options Indexes FollowSymLinks MultiViews
            AllowOverride None
            Order allow,deny
            allow from all
          </Directory>

          ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
          <Directory "/usr/lib/cgi-bin">
            AllowOverride None
            Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
            Order allow,deny
            Allow from all
          </Directory>

          ErrorLog ${APACHE_LOG_DIR}/site-error.log
          # Possible values include: debug, info, notice, warn, error, crit, alert, emerg.
          LogLevel warn
          CustomLog ${APACHE_LOG_DIR}/site-access.log combined

          Alias /doc/ "/usr/share/doc/"
          <Directory "/usr/share/doc/">
            Options Indexes MultiViews FollowSymLinks
            AllowOverride None
            Order deny,allow
            Deny from all
            Allow from 127.0.0.0/255.0.0.0 ::1/128
          </Directory>
        </VirtualHost>

    Still I can't access the page at http://site.com, but it works via http://localhost/. I have disabled all other sites, including the default, and have enabled just the one site, i.e. "site". How do I fix this?
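    The same hedged observation as the Roboteer question earlier on this page applies: if site.com is a placeholder that doesn't resolve anywhere, Apache never sees the request. For local testing, a hosts-file entry sidesteps DNS entirely (an assumption about the setup, not a production fix):

        $ echo "127.0.0.1 site.com www.site.com" | sudo tee -a /etc/hosts
        $ sudo apache2ctl configtest && sudo service apache2 reload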

  • Problems with both LightDM and GDM using DisplayLink USB monitor

    - by Austin
    When I use LightDM, it auto-logs in to the desktop just fine. The only problems are that Compiz doesn't work and menus don't work: I can't right-click the desktop, and I can't select program menus in the top bar (i.e. clicking "File" does nothing). When I use GDM, I only get a blank blue screen and the mouse cursor. I can't Ctrl+Alt+Backspace to restart, but I can Ctrl+Alt+F1 and Ctrl+Alt+F7 to switch modes. I don't think it's auto-logging me in, but I'm not sure; it plays the login-screen noise. Will update with more information when I get home!

    EDIT: Okay, so I did a fresh install, just to ensure I hadn't borked something playing in the console. I reconfigured my setup as I did before, with the same results. Here's what I followed. The only difference is that instead of setting "vga=normal nomodeset" I set "GRUB_GFXPAYLOAD_LINUX = text". Also, I only have the DisplayLink monitor configured in my xorg.conf file. At this point I'm using the open radeon driver, although I used the proprietary ati driver before. I'm not sure if I'm having a problem with:

    - X configuration
    - the graphics driver
    - the DisplayLink driver
    - Unity
    - LightDM
    - Compiz
    - or something else

    The resolution of the monitor is 800x480, 16-bit. I tried setting a larger virtual resolution of 1200x720 (because the real resolution is lower than the recommended resolution), but it causes Ubuntu to boot into low-graphics mode. When I get home I'm going to install the fglrx driver and see if it enables virtual resolutions, which may further enable my window manager to function properly.

  • Friday Fun: Wake Up the Box

    - by Mysticgeek
    Another Friday, and it's time to waste the rest of it playing a fun flash game online. Today we take a look at a relaxing physics-based puzzle game called Wake Up the Box.

    The goal of this game is to wake up the box character by attaching parts to existing wood objects in each stage. You can start a new game or continue your progress from where you left off. At the beginning you get a tutorial showing what you need to do to wake the box. You get wood parts and can attach them to other wood pieces, but not to metal or brick. After successfully waking up Mr. Box, you can go to the next level, or restart a level at any time if you're having problems figuring out the puzzle. Each level gets more difficult and the puzzles more challenging.

    Wake Up the Box is a relaxing yet challenging game that will let you have fun, not working on TPS reports, until the whistle blows. Play Wake Up the Box at FreeWebArcade.

  • Critique My Backup and Storage Plan

    - by MetaHyperBolic
    My current storage (RAID-1 off a hardware RAID card) and backup (a spare drive) solutions for my home network are inadequate. I have too much data scattered across various one-off drives. It is time to evolve. Backups seem simple enough, at least: lots of big drives. However, I am bewildered by the number of choices for small home storage. The Drobo S looks appealing. So does the ReadyNAS. I am not looking for bunches of shiny features; I'm mostly interested in reliability. I am not interested in building Yet Another PC to act as a file server, or doing something in the cloud, or whatever. I'm stupid, so I am keeping it simple.

    Requirements for the main volume:
    - Starting working space roughly 2 TB, with options for growth up to 5 TB
    - RAID or something RAID-like, with at least one parity drive
    - eSATA II for speed during backups
    - Ability to shut down gracefully when alerted of low power by a UPS
    - Optional but desirable: will take 2 TB drives now, with options for the larger 3 TB drives coming in 2010-2011
    - Optional but desirable: RAID-6 or something similar, with two parity drives
    - Optional but desirable: hot spare
    - Ethernet connection not required, as the volume will be shared via the same machine which runs my home print server

    Backups:
    - Backup performed via ROBOCOPY in mirror mode to an external hard drive over an eSATA II connection
    - Start by rotating between two external 2 TB hard drives, going up to six external 2 TB drives
    - Start with a weekly backup, move to a bi-weekly backup as more drives are added
    - Move to 3 TB drives as the size of my main volume increases
    - Backup drives will be stored at an off-site location

    Hard drives: I plan on buying all the same model, but different batches from different vendors. I found a "burn-in" utility with which I can pound away on the drives for a couple of weeks before adding them to the backup pool or the main volume. I estimate that I am looking at roughly $1,500 to start, once I throw in two 2 TB drives for backup and four for storage.

    So, are there any obvious flaws in my plan? What have I overlooked? Any suggestions for the storage device for my main volume that fits my requirements? Or do I just keep it simple with 2 drives in RAID-1, then perform due diligence with my backups, accepting that I will have to buy a whole new unit when my data grows past 2 TB?

  • Plesk error "pmm-ras error (Error code = -6)" during restore; do I need to increase the /tmp folder?

    - by eric
    I had to re-install Plesk on a CentOS 6 system after a crash. The full backup file is 11 GB, but at the beginning of the backup restore I get the error:

        Error: pmm-ras error (Error code = -6):

    Argh! My disk organization is like this:

        Filesystem             Size  Used Avail Use% Mounted on
        /dev/xvda1             3.7G  801M  2.9G  22% /
        /dev/mapper/vg00-usr    14G  1.5G   12G  12% /usr
        /dev/mapper/vg00-var   155G   14G  134G  10% /var
        /dev/mapper/vg00-home  3.9G  136M  3.6G   4% /home
        none                  1000M  7.5M  993M   1% /tmp

    I suppose I have to increase my /tmp folder to accept the backup size, but I don't know how to. I'm on a 1&1 cloud server. Thanks for your help. You can imagine the emergency of this situation...
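    The df output shows /tmp as a 1000 MB memory-backed mount ("none"), far smaller than an 11 GB archive. A hedged sketch of one workaround, assuming /tmp really is a tmpfs and that there is enough RAM plus swap to back a bigger one:

        $ sudo mount -o remount,size=16G /tmp
        # to survive a reboot, the same size= option belongs on the
        # tmpfs line for /tmp in /etc/fstab as well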

  • Apache Server-Side Includes Refuse to Work (Tried everything in the docs but still no joy)

    - by raindog308
    Trying to get Apache server-side includes to work. Really simple: I just want to include a footer on each page. Apache 2.2:

        # ./httpd -v
        Server version: Apache/2.2.21 (Unix)
        Server built:   Dec 4 2011 18:24:53
        Cpanel::Easy::Apache v3.7.2 rev9999

    mod_include is compiled in:

        # /usr/local/apache/bin/httpd -l | grep mod_include
        mod_include.c

    And it's in httpd.conf:

        # grep shtml httpd.conf
        AddType text/html .shtml
        DirectoryIndex index.html.var index.htm index.html index.shtml index.xhtml index.wml index.perl index.pl index.plx index.ppl index.cgi index.jsp index.js index.jp index.php4 index.php3 index.php index.phtml default.htm default.html home.htm index.php5 Default.html Default.htm home.html
        AddHandler server-parsed .shtml
        AddType text/html .shtml

    In the web directory I created a .htaccess with:

        Options +Includes

    And then in the document, I have:

        <h1>next should be the include</h1>
        <!--#include virtual="/footer.html" -->
        <h1>include done</h1>

    And I see nothing in between those headers. Tried file=, also with/without an absolute path. Is there something else I'm missing? I see the same thing on another unrelated server (more or less stock CentOS 6), so I suspect the problem is between keyboard and chair...
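    One hedged line of investigation: an Options line in .htaccess is silently ignored unless the enclosing <Directory> block grants AllowOverride Options (or All), and the page itself must carry the .shtml extension for the server-parsed handler above to fire at all. Checking the active override settings is cheap (the config path is inferred from the install prefix shown above, so treat it as an assumption):

        # grep -n "AllowOverride" /usr/local/apache/conf/httpd.conf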

  • Manage Sending 2010 Documents to the Web with Office Upload Center

    - by Mysticgeek
    One of the main new features being touted in Office 2010 is the ability to upload documents to the Web for sharing and collaboration. Today we look at using Office Upload Center to help manage your uploaded documents.

    When you upload an Office 2010 document to the web, a handy tool for managing it is the Microsoft Office Upload Center. It's a way to see what is being uploaded or what might have failed to reach the servers, and it lets you know if a document failed to upload for some reason. In this case it looks like incorrect credentials were entered when signing into Windows Live; click on the Resolve button to get a list of actions you can take to get things corrected.

    You can access the Upload Center from the icon which appears in the System Tray when uploading documents. Right-click the icon to control notifications, pause uploads, and access its settings. In the Settings section you can choose how Upload Center displays notifications, select the number of days to keep files in the cache, and delete currently cached files. If you find yourself uploading several documents to the web during the day, the Office Upload Center is a nice feature for managing them.

  • How should I remotely manage Dell Poweredge 2850 running Ubuntu server?

    - by Saul
    First I've got to say I'm a Linux/Ubuntu novice, so go gentle on me as I'm on day 3. I've managed to get Ubuntu Server 8.04 LTS installed and running on the PowerEdge 1850 I bought off eBay. The box will go in a rack at my office, but I want to be able to work on it and power it on and off from home, and I gather that IPMI over LAN might be the way to do this, or maybe it's something to do with the BMC? I want to be able to administer/manage it from a client PC at home running XP. I will be configuring the office router to forward ports 80 and 443 to the Ubuntu server running Apache2, and I'm puzzled about how the remote management works (unless it comes on a different port forwarded to a different internal IP). Thanks for any help
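    IPMI over LAN does work the way the question hopes: the BMC listens on the network independently of the OS, so power control works even when the box is off. A hedged sketch using ipmitool from a Linux machine, where the BMC address and credentials are placeholders you would first configure in the server's BIOS/BMC setup:

        $ ipmitool -I lan -H 192.0.2.10 -U root chassis power status
        $ ipmitool -I lan -H 192.0.2.10 -U root chassis power on
        # the BMC speaks on its own port (UDP 623), so forwarding 80/443 to Apache
        # does not expose it; exposing IPMI directly to the internet is risky anyway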

  • What are Information Centers?

    - by user12244613
    Information Centers are similar to the product pages in the Oracle Sun System Handbook. Many customers like the Handbook's concept of a home page with all the product attributes, troubleshooting, etc. accessible from a single place. This concept is now available for a range of Oracle Solaris, Systems, and Storage products. The Information Center for each product covers areas such as Overview, Hot Topics, and Patching and Maintenance. The Information Center pages are dynamically generated each night to ensure the latest content is available to you. Here are the top Solaris, Systems, and Storage Information Centers:

    - Oracle Explorer Data Collector
    - Oracle Solaris 10 Live Upgrade
    - Oracle Solaris 11 Booting Information Center
    - Oracle Solaris 11 Desktop and Graphics Information Center
    - Oracle Solaris 11 Image Packaging System (IPS) Information Center
    - Oracle Solaris 11 Installation Information Center
    - Oracle Solaris 11 Product Information Center
    - Oracle Solaris 11 Security Information Center
    - Oracle Solaris 11 System Administration Information Center
    - Oracle Solaris 11 Zones Information Center
    - Oracle Solaris Crash Analysis Tool (SCAT) Information Center
    - Oracle Solaris Cluster Information Center
    - Oracle Solaris Internet Protocol Multipathing (IPMP) Information Center
    - Oracle Solaris Live Upgrade Information Center
    - Oracle Solaris ZFS Information Center
    - Oracle Solaris Zones Information Center
    - CMT T1000/T2000 and Netra T2000
    - CMT T5120/T5120/T5140/T5220/T5240/T5440 Systems
    - M3000/M4000/M5000/M8000/M9000-32/M9000-64
    - Management and Diagnostic Tools for Oracle Sun Systems
    - Netra CT410/810 and Netra CT900
    - Network-Attached Storage (NAS)
    - Oracle VM Server for SPARC (LDoms)
    - Pillar Axiom 600
    - SL3000 Tape Library
    - Sun Disk Storage Patching and Updates
    - Sun Fire 3800/4800/4810/6800/E2900/E4900/E6900/V1280 - Netra 1280/1290
    - Sun Fire 12K/15K/E20K/E25K
    - Sun Fire X4270 M2 Server
    - Sun x86 Servers
    - T3 and T4 Systems
    - Tape Domain Firmware
    - V210/V240/V440/V215/V245/V445 Servers
    - VSM (VTSS/VLE/VTCS)

  • How do I make my existing Ubuntu into a bootable installation CD? I tried Remastersys but it fails with 11.10

    - by YumYumYum
    I need to install 10 PCs which have identical setup and hardware, so I was trying Remastersys, but it's failing. How can I resolve this, or use something else to achieve it? The remastersys.log ends with:

        cat: /home/remastersys/remastersys/tmpusers: No such file or directory
        Cleaning up the install icon from the user desktops
        Removing the ubiquity frontend as it has been included and is not needed on the normal system
        Calculating the installed filesystem size for the installer
        Removing remastersys-firstboot from system startup
        Removing any system startup links for /etc/init.d/remastersys-firstboot ...
        /etc/rc0.d/K20remastersys-firstboot
        /etc/rc1.d/K20remastersys-firstboot
        /etc/rc2.d/S20remastersys-firstboot
        /etc/rc3.d/S20remastersys-firstboot
        /etc/rc4.d/S20remastersys-firstboot
        /etc/rc5.d/S20remastersys-firstboot
        /etc/rc6.d/K20remastersys-firstboot
        Making disk compatible with Ubuntu Startup Disk Creator.
        Creating md5sum.txt for the livecd/dvd
        Creating /var/tmp/custom.iso in /home/remastersys/remastersys
        The iso was not created. There was a problem. Exiting

    Follow-up:
    1) I am unhappy that nothing exists to recover/backup 11.10 this way
    2) Anyway, I had to do it
    3) I did not use the popular Clonezilla because it does not give me an ISO
    4) I downloaded Clonezilla-SysRescCD from http://clonezilla-sysresccd.hellug.gr/download.html
       a) created a bootable CD from that ISO
       b) booted and followed the steps at http://clonezilla-sysresccd.hellug.gr/restore.html
       c) got an ISO file with everything on it, including boot loaders
    5) Then on another system I used my same CD to restore my image. Worked perfectly.

  • Ubuntu to Ubuntu VNC over SSH tunnel

    - by rxt
    I have a Linux Ubuntu desktop at home: SSH enabled, VNC server installed, router rule configured. It all works, and at home I can connect over the local network from my Mac. From the outside I can log in via SSH. I've configured PuTTY as follows:

        Session: host name and port number
        Connection > SSH > Tunnels > Forwarded ports: L5900|192.168.0.23

    The local address is 192.168.1.45. When I make the connection I can log in to the remote machine. Then I open Remote Desktop Viewer and click Connect:

        protocol: vnc
        host: ?
        use host as ssh tunnel: ?

    I don't know what to use for the last two options. Which IP addresses should I use?
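    For comparison, here is the same tunnel expressed as a plain ssh command (addresses taken from the post; the public address and username are placeholders). Once a tunnel like this is up, the VNC client connects to the local end of it, i.e. localhost on port 5900, and needs no SSH settings of its own:

        $ ssh -L 5900:192.168.0.23:5900 user@your-public-address
        # then point the VNC viewer at localhost (display :0 / port 5900)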

  • Running make for Nginx throws a “multiple target patterns” error

    - by Justin Meltzer
    When I run make inside my nginx source directory I get this output:

        make -f objs/Makefile
        make[1]: Entering directory `/home/ec2-user/nginx/nginx-1.2.4'
        objs/Makefile:110: *** multiple target patterns.  Stop.
        make[1]: Leaving directory `/home/ec2-user/nginx/nginx-1.2.4'
        make: *** [build] Error 2

    I am on an Amazon Linux AMI. The steps I took from the beginning were:

        wget /path/to/nginx/tarball
        tar xvf nginx-1.2.4.tar.gz
        cd nginx-1.2.4
        ./configure --prefix=/nginx --a-bunch-of-other-options

    Then I ran make. Also, I installed make by running:

        sudo yum install make

    Please let me know if there's any other information I should be providing.
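    A hedged pointer rather than an answer: GNU make emits "multiple target patterns" when a rule line contains an unexpected extra colon, often because a path with a colon in it (from a configure option) leaked into the generated makefile. Since the message names the exact line, reading it costs nothing:

        $ sed -n '105,115p' objs/Makefile   # look for a stray ':' around line 110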

  • Breadcrumbs RDFa

    - by Saahil Sinha
    I have implemented breadcrumb RDFa on http://www.mycarhelpline.com/index.php?option=com_forms&view=pages&layout=sellcar&Itemid=4

    When checking the page, the RDFa data shows only: property: title: Home
    https://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fwww.mycarhelpline.com%2Findex.php%3Foption%3Dcom_forms%26view%3Dpages%26layout%3Dsellcar%26Itemid%3D4

    However, when I compare ours with another site's result, http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Froyalenfield.com%2Fmotorcycles%2Fthunderbird-500%2F, the title and description of the current page are shown in their RDFa data but not in ours.

    Could someone suggest how to get the page title and description to show up in the RDFa data? Below is our breadcrumb code:

        <p><span class="breadcrumbs pathway">
          <span typeof="v:Breadcrumb">
            <a href="" rel="v:url" property="v:title">Home</a> &raquo;
            <span rel="v:child">
              <span typeof="v:Breadcrumb">
                <a href="index.php?option=com_forms&view=pages&layout=selloldcarindelhi&Itemid=4" rel="v:url" property="v:title">Sell Car</a> &raquo;
                <span rel="v:child">
                  <span typeof="v:Breadcrumb">
                    <a property="v:title">Sell Used Car</a>
                  </span>
                </span>
              </span>
            </span>
          </span>
        </span>

  • [MINI HOW-TO] Remove the Search Helper Extension from Firefox

    - by Asian Angel
    If you found a surprise new extension added to Firefox after the June Patch from Microsoft, then you are likely rather unhappy right now. Join us as we show you how to remove the Search Helper extension from your browser.

    An Unexpected Addition to Your Extensions: you may be wondering what the mysterious new extension is for. Its purpose is to help the Bing Toolbar integrate better with your browser. Unless you have the Bing Toolbar installed, you really do not need this cluttering up your browser. So how do you get rid of it?

    Removing the Extension: in order to remove the extension you will need to navigate to the following location:

        C:\Program Files\Microsoft\Search Enhancement Pack\Search Helper

    Once there, delete the "firefoxextension" folder; that is all there is to it. If you want to remove the search helper add-on for Internet Explorer as well, delete the "SEPsearchhelperie.dll" file while you are here. Note: you may need administrator rights in order to delete the folder.

    No more Search Helper extension! If you are unhappy about this update being snuck onto your system, following these instructions will remove it. See the Microsoft support page about update KB982217 for details.

  • Cron Permission Denied

    - by worldthreat
    Good day. I have a bash script in my home directory that works properly from the command line (the file structure is a default Media Temple DV, noted for certain permission issues), but I receive this error from cron:

        /home/myFile.sh: line 2: /var/www/vhosts/domain.com/subdomains/techspatch/installation.sql: Permission denied

    NOTICE: it's just line 2... it writes to the local server just fine. Below is the bash file:

        #!/bin/bash
        mysqldump -uUSER -pPASSWORD -hHOST dbName > /var/www/vhosts/domain.com/subdomains/techspatch/installation.sql
        mysql -uadmin -pPASSWORD -hlocalhost dbName < /var/www/vhosts/domain.com/subdomains/techspatch/installation.sql

    I can't chmod from bash (lol, yeah, I tried). Writing the file somewhere else and setting the permissions before the transfer is useless. I have googled the heck out of this situation and it still seems unique... any insight is appreciated.
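    A hedged debugging step rather than a diagnosis: cron jobs run with a minimal environment and sometimes under a different crontab identity than your interactive shell, and line 2 is the first line that opens that vhost path for writing. Capturing who the job really runs as makes the permission question concrete (the output path is illustrative):

        # temporary crontab entry; remove it once you've read the output
        * * * * * { id; umask; } > /tmp/cron-env.txt 2>&1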
