Search Results

Search found 10107 results on 405 pages for 'remote backups'.

Page 4 of 405

  • After upgrading to 12.04 from 10.10 my Mythbuntu standard MCEUSB remote no longer works

    - by keepitsimpleengineer
    I had no problems using my Windows Media Center remote with Mythbuntu 10.10, but after upgrading, it no longer affects Mythbuntu. I have verified and re-installed it in Mythbuntu Control Centre, and I have used irw to verify that the IR button presses are properly received by the HTPC. How do I go about fixing this? Kernel: 3.2.0-26-generic (#41-Ubuntu SMP Thu Jun 14 17:49:24 UTC 2012); Xorg version: 1.11.3 (16 July 2012 08:06:31PM); GCC: 4.6 (x86_64-linux-gnu); current updates as of 2012-07-21.

        $ cat /etc/lirc/hardware.conf
        #Chosen Remote Control
        REMOTE="Windows Media Center Transceivers/Remotes (all)"
        REMOTE_MODULES="lirc_dev mceusb"
        REMOTE_DRIVER=""
        REMOTE_DEVICE="/dev/lirc0"
        REMOTE_SOCKET=""
        REMOTE_LIRCD_CONF="mceusb/lircd.conf.mceusb"
        REMOTE_LIRCD_ARGS=""
        #Chosen IR Transmitter
        TRANSMITTER="None"
        TRANSMITTER_MODULES=""
        TRANSMITTER_DRIVER=""
        TRANSMITTER_DEVICE=""
        TRANSMITTER_SOCKET=""
        TRANSMITTER_LIRCD_CONF=""
        TRANSMITTER_LIRCD_ARGS=""
        #Enable lircd
        START_LIRCD="true"
        #Don't start lircmd even if there seems to be a good config file
        #START_LIRCMD="false"
        #Try to load appropriate kernel modules
        LOAD_MODULES="true"
        # Default configuration files for your hardware if any
        LIRCMD_CONF=""
        #Forcing noninteractive reconfiguration
        #If lirc is to be reconfigured by an external application
        #that doesn't have a debconf frontend available, the noninteractive
        #frontend can be invoked and set to parse REMOTE and TRANSMITTER
        #It will then populate all other variables without any user input
        #If you would like to configure lirc via standard methods, be sure
        #to leave this set to "false"
        FORCE_NONINTERACTIVE_RECONFIGURATION="false"
        START_LIRCMD=""

        $ lsusb | grep -i infrared
        Bus 003 Device 002: ID 0471:0815 Philips (or NXP) eHome Infrared Receiver
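
    A minimal diagnostic sketch for this situation (the commands are standard tools, but the actual fix depends on what is grabbing the receiver, so treat this as a starting point rather than the answer):

        # restart lircd so it re-reads hardware.conf
        sudo service lirc restart
        # confirm lircd is still emitting button events (press some buttons)
        irw
        # list kernel rc devices; on newer kernels the in-kernel rc-core can
        # grab the mceusb receiver ahead of lirc (ir-keytable is in v4l-utils)
        ir-keytable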

    Read the article

  • Problem with wake after suspend using USB remote.

    - by Bod
    Hi, I'm a Linux newbie looking for some help. I'm currently setting up an XBMC HTPC using a laptop and 10.10, and everything works great except for waking from suspend using the power button on the remote. Suspending from the remote works fine, as does resuming with the power button on the laptop. I've checked /proc/acpi/wakeup, which initially showed the following:

        Device  S-state  Status     Sysfs node
        C096    S5       *disabled  pci:0000:00:1e.0
        C0F1    S3       *disabled  pci:0000:00:1d.0
        C0F8    S3       *disabled  pci:0000:00:1d.1
        C0F9    S3       *disabled  pci:0000:00:1d.2
        C0FA    S3       *disabled  pci:0000:00:1d.3
        C0FB    S3       *disabled  pci:0000:00:1d.7
        C102    S5       *disabled  pci:0000:00:1c.0
        C22B    S5       *disabled  pci:0000:08:00.0
        C115    S5       *disabled  pci:0000:00:1c.2
        C22C    S5       *disabled
        C118    S5       *disabled  pci:0000:00:1c.3
        C22C    S5       *disabled

    I've since configured the above so that the S3 devices are enabled, and I've confirmed with lspci that they are the correct (USB controller) devices:

        00:1d.0 USB Controller: Intel Corporation N10/ICH 7 Family USB UHCI Controller #1 (rev 01)
        00:1d.1 USB Controller: Intel Corporation N10/ICH 7 Family USB UHCI Controller #2 (rev 01)
        00:1d.2 USB Controller: Intel Corporation N10/ICH 7 Family USB UHCI Controller #3 (rev 01)
        00:1d.3 USB Controller: Intel Corporation N10/ICH 7 Family USB UHCI Controller #4 (rev 01)
        00:1d.7 USB Controller: Intel Corporation N10/ICH 7 Family USB2 EHCI Controller (rev 01)

    None of this has worked, unfortunately, and I'm now stuck. The machine simply refuses to wake from the remote, and the USB receiver shows no activity LED while suspended. Suspend/resume from the remote works fine in Windows 7, so I know the laptop is OK with it. Any ideas? I need to get this sorted to gain Wife Approval for this system. Thanks, Bod.
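
    For reference, a sketch of how USB wakeup is usually toggled on a setup like this. The device names come from the post above, but the sysfs path is an assumption — find the receiver's real path via lsusb and /sys/bus/usb/devices:

        # toggle an ACPI wakeup device (writing its name flips enabled/disabled)
        echo C0F1 | sudo tee /proc/acpi/wakeup
        # also enable wakeup on the receiver itself (path is hypothetical)
        echo enabled | sudo tee /sys/bus/usb/devices/3-1/power/wakeup

    Note that neither setting persists across reboots, so they typically go in a boot script.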

    Read the article

  • Remote Desktop advice

    - by spoon16
    Coming from Windows, so that is what my expectations are based on. I have an Ubuntu Desktop Edition instance running as a virtual machine on a server. I would like to use it as my primary open-source dev environment, but the VNC tools I have used don't seem to be as rich as Remote Desktop Connection in Windows. The things that are missing for me:
    - connecting/logging into non-console user sessions
    - dynamically resizing the graphical resolution based on the size of the remote desktop window
    - device sharing (USB devices plugged into the client shared with the remote)
    Is there an appropriate client that I can run on Windows to connect to my Ubuntu dev instance that provides these capabilities?
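
    One option worth noting, as a sketch rather than a full match for the wish list: xrdp speaks the RDP protocol the stock Windows client expects, so login sessions and the familiar client come along, but USB device redirection is not something it provided:

        # on the Ubuntu VM
        sudo apt-get install xrdp
        sudo service xrdp restart
        # then from Windows, connect with the built-in client:
        #   mstsc /v:ubuntu-host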

    Read the article

  • Forcing users to change password on first login - Windows Server 2008 R2 Remote Desktop Services

    - by George Durzi
    I'm setting up a demo lab environment in which each demo lab user is assigned 4 accounts to use in the lab. Users access the lab via Remote Desktop to the "client" machine in the lab, exposed at demolab.mydomain.com. The setup:
    - The client machine is a Windows Server 2008 R2 Enterprise Edition server
    - The Remote Desktop Services role is configured on this server
    - Remote connection settings are configured to allow users to connect with any version of the Remote Desktop client
    - All accounts are members of the local Administrators and Remote Desktop Users groups
    - All accounts are configured to be forced to change the default password after first login
    The user is instructed to remote into the lab with an account designated as their main account, and then establish 3 more remote desktop sessions within the lab using their 3 other assigned demo lab accounts. When establishing the initial remote desktop connection to the lab using their main account, the user sees the change-password dialog as expected. However, after logging in and trying to establish remote desktop connections to the server with their three other accounts, they are prompted that they need to change the password after logging in, but can't continue with the login process - they don't see the expected change-password experience. After logging in with the primary account, it doesn't make a difference whether I establish the Remote Desktop connection using the name of the server (e.g. Client) or demolab.mydomain.com. I experimented with changing the settings for remote connections to require NLA, but that didn't make a difference. Appreciate any tips. Thanks
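
    For what it's worth, the change-password-at-logon flow is known to clash with Network Level Authentication, so it's worth verifying the effective NLA setting from the registry rather than the UI. A sketch using the standard key (run on the server; the value meanings are the documented ones):

        :: 1 = NLA required (blocks the change-password flow), 0 = not required
        reg query "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp" /v UserAuthentication
        :: to turn NLA off for a test, then restart the Remote Desktop Services service:
        reg add "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp" /v UserAuthentication /t REG_DWORD /d 0 /f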

    Read the article

  • How to configure Remote Desktop on Windows Server 2008 R2?

    - by Abdullah BaMusa
    I’m trying to connect over the internet, via Remote Desktop, to my home workstation running Windows Server 2008 R2 (Web Edition) from my PC at work (which runs Windows 7). I configured the workstation to accept Remote Desktop, and I can connect to it from my laptop when I’m within the same home LAN, but I can’t establish the connection from my PC at work. My questions: Is it possible to connect to my workstation over the internet using Remote Desktop? Is there any step-by-step resource for setting up this feature?
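
    In outline, yes: allow RDP through the Windows firewall on the workstation, then forward TCP 3389 on the home router to the workstation's LAN address. A sketch of the firewall half (the rule name is arbitrary):

        netsh advfirewall firewall add rule name="Remote Desktop" dir=in action=allow protocol=TCP localport=3389
        :: then forward TCP 3389 on the router to the workstation's internal IP,
        :: and connect from work with:  mstsc /v:your-public-ip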

    Read the article

  • How to render remote assistance to a person using Live Messenger?

    - by Cheeso
    There is a feature within Windows Live Messenger v9 that allows a person to ask for remote assistance. But as I understand it, this works only if the router is UPnP-enabled on both ends. Today I tried this with a friend during an active chat session, and nothing happened. I suspect a router problem; as I am remote, I cannot configure the router for them. What's a good way to render remote assistance? Here's the scenario: it will be based on invitation only (it's not a remote desktop or "logmein" situation). It's a younger person, a computer novice, on the other end of the wire. I'll be assisting with their use of applications on the PC. I'd like to be able to SEE the screen, and also use the mouse and keyboard. I have used UltraVNC on the target machine and vncviewer on my machine, on a LAN, and it works well. But I don't think I can use that, because it's my kids' computer in my ex-wife's place, and I don't want her to accuse me of spying on her computer. That's why I need it to be invitation only. Advice please. Is there an easy way for me to set up Remote Assistance? Is there some other tool I can use?
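
    One strictly invitation-based route is Windows Remote Assistance itself, driven from the command line; a sketch (the file name and password are placeholders, and note it can still be defeated by NAT on both ends, just like the Messenger route):

        :: on the novice's machine: create a password-protected invitation file
        msra.exe /saveasfile C:\invite.msrcIncident S0mePassw0rd
        :: they send you the file and tell you the password; on your machine:
        msra.exe /openfile C:\invite.msrcIncident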

    Read the article

  • How to connect through a proxy using Remote Desktop?

    - by scottmarlowe
    So I've got a home server running Windows Server 2003. I use a dual-network-card setup and Routing and Remote Access to link the internal, private network to the external connection. The external connection hooks directly to my cable modem (so no routers or other devices sitting between). The problem I'm having is that I can't connect remotely from a location outside the house (i.e., connecting to the server's external connection) using either Remote Desktop or VNC. I have enabled both ports in Routing and Remote Access's firewall to allow access, and I have enabled Remote Desktop in Windows Server 2003. The odd thing is that I can access my home server's SVN repository, and I can even ping the server's IP. I am using the IP to attempt to connect, though I use a dyndns.com-provided name to connect to my SVN repository, so it shouldn't make a difference (I know the IP is getting resolved correctly). Any ideas on where to start diagnosing this one? I haven't seen anything in my server's event log. If any other info is needed, let me know. Thanks.
    UPDATE: One last piece of information: we use a proxy server at work, which I'm nearly 100% sure is the culprit. I have a workaround - if I connect to our VPN (even though I'm already inside the building) I am able to connect to my home server. This is with VNC. However, is there a way to connect through a proxy using Remote Desktop?
    ONE MORE UPDATE: Indeed, it was the HTTP proxy I'm sitting behind at work that was causing the issue. An acceptable workaround is to use my VPN connection to bypass the proxy, and I'm in!
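
    An alternative workaround, assuming the proxy passes some outbound port you can run SSH on at home: tunnel RDP through an SSH connection to the home server. A sketch (host names are placeholders):

        # forward a local port through the tunnel to the server's RDP port
        ssh -L 3390:localhost:3389 user@home-server.dyndns.example
        # then point the RDP client at the tunnel entrance:
        #   mstsc /v:localhost:3390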

    Read the article

  • Web hosting providers for businesses (with offsite backups, disaster recovery options, etc.) [closed]

    - by Harry Muscle
    Possible Duplicate: How to find web hosting that meets my requirements? I'm wondering if anyone can point me in the direction of a couple of web hosting providers that are geared towards businesses. By this I mean providers that make it easy to create daily off site backups, are aware that websites require disaster recovery options and have these in place or are able to assist with them, etc. We currently have about a dozen sites with various providers, however, I've been asked to consolidate all of these into one provider and create a full disaster recovery plan. Unfortunately it seems like most providers are geared towards average users that don't require all these extra bells and whistles that businesses need. For example, HostGator, which is a very popular and well reviewed provider, doesn't even allow you to schedule full backups, they have to be manually requested via cPanel and then downloaded once available. If anyone can point out a couple companies that might be able to help with these sorts of things that would be much appreciated. Thanks, Harry P.S. I should also add that we are hoping to stay away from having to manage our own server, we're hoping for a fully managed solution like what HostGator would offer for example.
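
    Whichever host you pick, it is worth owning the offsite half of the plan yourself rather than depending on the provider's backup button. A minimal sketch of a nightly offsite copy (the paths, host, and schedule are assumptions):

        # crontab entry: push the host's backup directory offsite at 03:00 daily
        0 3 * * * rsync -az --delete /backup/ backupuser@offsite.example.com:/backups/sites/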

    Read the article

  • DBA Best Practices - A Blog Series: Episode 1 - Backups

    - by Argenis
    This blog post is part of the DBA Best Practices series, in which various topics of concern for daily database operations are discussed. Your feedback and comments are very much welcome, so please drop by the comments section and be sure to leave your thoughts on the subject.

    Morning Coffee
    When I was a DBA, the first thing I did when I sat down at my desk at work was check that all backups had completed successfully. It really was more of a ritual, since I had a dual system in place to check for backup completion: 1) the scheduled agent jobs to back up the databases were set to alert the NOC on failure, and 2) I had a script run from a central server every so often to check for any backup failures. Why the redundancy, you might ask. Well, for one, I was once bitten by the fact that Database Mail doesn't work 100% of the time. Potential causes for failure include issues on the SMTP box that relays your server email, firewall problems, DNS issues, etc. And so, to be sure that my backups completed fine, I needed to rely on a mechanism other than having the servers do the talking - I needed to interrogate the servers and ask each one if an issue had occurred. This is why I had a script run every so often. Some of you might have monitoring tools in place, like Microsoft System Center Operations Manager (SCOM) or similar third-party products, that would track all these things for you. But at that moment we had no recourse but to write our own PowerShell scripts to do it. Now, it goes without saying that if you don't have backups in place, you might as well find another career. Your most sacred job as a DBA is to protect the data from a disaster, and only properly safeguarded backups can offer you peace of mind here.

    "But, we have a cluster... we don't need backups"
    Sadly, I've heard this line more often than I would have liked. You need to understand that a cluster is built on shared storage, and that is precisely your single point of failure. A cluster will protect you from an issue at the operating-system level, and also during an outage of any SQL-related service or dependent devices. But it will most definitely NOT protect you against corruption, nor will it protect you against somebody deleting data from a table - accidentally or otherwise.

    Backups, fine. How often do I take a backup?
    The answer to this is something you will hear frequently when working with databases: it depends. What does it depend on? For one, you need to understand how much data your business is willing to lose. This is what's called the Recovery Point Objective, or RPO. If you don't know how much data your business is willing to lose, you need to have an honest and realistic conversation about data-loss expectations with your customers, internal or external. From my experience, their first answer to the question "how much data loss can you withstand?" will be "zero". In that case, you will need to explain how zero data loss is very difficult and very costly to achieve, even in today's computing environments. Do you want to go ahead and take full backups of all your databases every hour, or even every day? Probably not, because of the impact that taking a full backup can have on a system. That's what differential and transaction log backups are for. Have I answered the question of how often to take a backup? No, and I did that on purpose. You also need to think about how much time you have to recover from any event that requires you to restore your databases. This is what's called the Recovery Time Objective, or RTO. Again, if you ask your customer how long an outage they can withstand, at first you will get a completely unrealistic number - and that will be your starting point for discussing a solution that is cost-effective. The point I'm trying to get across is that you need to have a plan. This plan needs to be practiced and tested. Like a football playbook, you need to rehearse the moves you'll perform when the time comes. How often is up to you, and the objective is that you feel better about yourself and the steps you need to follow when an emergency strikes.

    A backup is nothing more than an untested restore
    Backups are files. Files are prone to corruption. Put those two together and realize how you feel about those backups sitting on that network drive. When was the last time you restored any of those? Restoring your backups on another box - which, by the way, doesn't have to match the specs of your production server - will give you two things: 1) peace of mind, because now you know that your backups are good, and 2) a place to offload your consistency checks with DBCC CHECKDB or any of the other DBCC commands like CHECKTABLE or CHECKCATALOG. This is a great strategy for VLDBs that cannot withstand the additional load created by the consistency checks. If you choose to offload your consistency checks to another server, though, be sure to run DBCC CHECKDB WITH PHYSICAL_ONLY on the production server, and if you're using SQL Server 2008 R2 SP1 CU4 and above, be sure to enable trace flags 2562 and/or 2549, which will speed up the PHYSICAL_ONLY checks further - you can read more about this enhancement here. Back to the "how often" question for a second: if you have the disk, the network latency, and the system resources to do so, why not back up the transaction log often? As in every 5 minutes, or even less than that? There's not much downside to doing it, as you will have to clear the log with a backup sooner rather than later, lest you risk running out of space on your tlog, or even your drive. The one drawback to this approach is that you will have more files to deal with at restore time, and processing each file will add a bit of extra time to the entire process. But it might be worth that time, knowing that you minimized the amount of data lost. Again, test your plan to make sure that it matches your particular needs.

    Where to back up to? Network share? Locally? SAN volume?
    This is another topic where everybody has a favorite choice, so I'll stick to mentioning what I like to do and what I consider to be the best practice in this regard. I like to back up to a SAN volume, i.e., a drive that actually lives in the SAN and can be easily attached to another server in a pinch, saving you valuable time - you wouldn't need to restore files over the network (slow) or pull drives out of a dead server (been there, done that; it’s also slow!). The key is to have a copy of those backup files made quickly and, if at all possible, to a remote target in a different datacenter - or even the cloud. There are plenty of solutions out there that can help you put such a solution together. That right there is the first step towards a practical Disaster Recovery plan. But there's much more to DR, and that's material for a different blog post in this series.
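
    The post mentions home-grown PowerShell for interrogating servers about backup failures; purely as an illustration of that idea, here is a minimal sqlcmd sketch that asks one server for the last completed backup per database (the server name is a placeholder; msdb.dbo.backupset is the standard catalog the check reads):

        sqlcmd -S PRODSQL01 -Q "SELECT d.name, MAX(b.backup_finish_date) AS last_backup FROM sys.databases d LEFT JOIN msdb.dbo.backupset b ON b.database_name = d.name GROUP BY d.name ORDER BY last_backup"

    A NULL or stale last_backup value is what a wrapper script would alert on, per server, independently of Database Mail.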

    Read the article

  • Why no icons for pcmanfm when run from a remote X server

    - by user75430
    pcmanfm works fine when run from a local console, but does not show file icons when run from a remote X session (ssh -X user@machine). Well, that's not quite true - icons for shell scripts show up OK, but there are no icons for regular files and folders. There are a load of "g_object_unref ... G_IS_OBJECT" errors in the X console window. Why are there no icons for pcmanfm when I run it from a remote X server?

    Read the article

  • Scheduling Automatic Backups for a Virtual Private Web Server running CentOS 6.3 and WHM

    - by Oliver Farrell
    I'm pretty new to administering my own VPS, but thus far am finding it quite a compelling experience - there's something quite refreshing about having complete control over everything it does. One thing that I would like to look at is a suitable backup solution (a few times a day). My current setup: I'm running a CentOS 6.3 VPS with a single 25GB hard drive, solely for the purpose of hosting websites, and I'm using WHM and cPanel for administering them. I now plan on adding an additional hard disk and hooking it up to my VPS. What I'm not sure about is how I get the two disks talking and get the backup process going. I'm not a seasoned SSH-er, so I don't really know where to start. I'm hosting with Serverlove (one of the best hosting providers I've used) and am provided with a number of unique identifiers for each hard disk, so I imagine these may play a part in linking them together. I appreciate that this is a little vague (I'm clutching at straws), but any assistance is very much appreciated.
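
    A rough sketch of the disk half of this, assuming the new disk appears as /dev/vdb (check with fdisk -l; the device name, mount point, and schedule are all assumptions for illustration):

        # one-time setup: format and mount the backup disk
        mkfs.ext4 /dev/vdb
        mkdir /backup-disk
        echo '/dev/vdb  /backup-disk  ext4  defaults  0 2' >> /etc/fstab
        mount -a
        # then either point WHM's Backup Configuration at /backup-disk,
        # or cron a copy of cPanel's backup directory every few hours:
        #   0 */6 * * * rsync -a /backup/ /backup-disk/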

    Read the article

  • How to Remote View and Control Your Android Phone

    - by Jason Fitzpatrick
    If you’ve ever wished you could see your Android phone’s screen on your desktop, or remote control it using your mouse and keyboard, we’ll show you how in this simple guide to gaining remote access to your Android device. Why would you want to gain access? When you’re done with this tutorial you’ll be able to view your phone’s screen on your computer monitor, which is great for: putting your Android notifications right alongside other notification boxes on your monitor, using it like an on-monitor caller ID, and taking screenshots and screencasts. Also, if your phone is rooted (and it should be! rooting unlocks so many great features) you’ll gain the ability to use your computer’s keyboard and mouse to control your Android phone. Remote keyboard/mouse control is great for inputting data on the tiny screen without needing to peck at the on-screen keyboard.

    Read the article

  • Blank desktop when logging in via xrdp

    - by nitefrog
    I am trying to access Ubuntu 11.10 using Remote Desktop from a Windows 7 machine. I installed xrdp, launch the Windows Remote Desktop client, and log in. I then get prompted for the username and password. It then logs in, but all I see is the background - no menus, nothing. I have to kill Remote Desktop by closing it. Even if I right-click, nothing. Any ideas? The only reason I even went down the RDP road was that VNC would not work either, even after I enabled desktop sharing. I am in a bind, as I need to connect to Ubuntu via Windows. In Ubuntu 8.x this was not an issue and it just worked.
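
    A workaround that was commonly reported for xrdp on 11.10: Unity's 3D session won't start inside the xrdp session, so pin the session to a 2D-capable one via ~/.xsession. A sketch, assuming the ubuntu-2d session that shipped with 11.10 is installed:

        # run on the Ubuntu machine as the user you log in with over RDP
        echo "gnome-session --session=ubuntu-2d" > ~/.xsession
        # then restart xrdp and reconnect
        sudo service xrdp restart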

    Read the article

  • Remote Desktop Connection can't connect to Windows Server 2012

    - by Guy Thomas
    Mission: Remote Desktop INTO Windows Server 2012 (standalone).
    Situation:
    - Control Panel, System, Remote Settings, Remote Desktop – Allow
    - All firewalls off
    - Connect attempt using a known IP address (ping works OK)
    - Connect option as a user who has already logged on
    Error message: "Remote Access Cannot Connect: 1) Remote access not enabled 2) Remote computer turned off 3) Remote computer not available"
    Additional info: the Server 2012 machine can RDC OUT. The machines I use to connect IN are Windows 7 and Windows 8, and they will RDC to other machines. I have fair experience configuring Remote Desktop.
    Question: Is this a fault of beta software on the 2012 server, or is there a new way of getting RDC to work that I am missing?
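
    Two quick checks from the server's own console that separate reason (1) from (2)/(3) in that error; both commands are standard, and the value meanings are the documented ones:

        :: is anything actually listening on the RDP port?
        netstat -ano | findstr :3389
        :: 0 = remote connections allowed, 1 = denied
        reg query "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server" /v fDenyTSConnections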

    Read the article

  • Remote Desktop fails after VPN connection.

    - by Samet Sorgut
    The remote computer is connected to with Remote Desktop. When the remote computer then connects to a VPN, the Remote Desktop session freezes, and it is not possible to connect to the remote computer again via Remote Desktop. What can be done to connect to this remote computer after it establishes a VPN connection? The only thing that comes to my mind is to install a second NIC and configure Remote Desktop to accept connections from that NIC while the VPN works from the other... What do you suggest?

    Read the article

  • Remote Desktop fails after VPN connection

    - by Samet Sorgut
    The local computer (comp 1) is connected to a remote computer (comp 2) with Remote Desktop. On the remote computer (comp 2), I try to establish a VPN connection to a different remote computer (comp 3). Once I try to establish the VPN connection from comp 2 to comp 3, Remote Desktop freezes on comp 1, and it is not possible to connect to comp 2 again via Remote Desktop. What can be done to connect to this remote computer (comp 2) after it establishes a VPN connection? The only thing that comes to my mind is to install a second NIC and configure Remote Desktop to accept connections from that NIC while the VPN works from the other... What do you suggest?
    EDIT: I want to use the internet connection of the VPN, so all traffic should go over the VPN, but with RDP still working.
    My IP: 100.0.0.1
    The IP I'm connecting to via RDP: 200.0.0.20 (mask: 255.255.255.192, gateway: 200.0.0.193)
    When the 200.0.0.20 machine connects to the VPN, the IP of the VPN is 65.254.61.250.
    Will routing like this help (command issued on 200.0.0.20, the RDP location)?
        route ADD 65.254.61.250 MASK 255.255.255.192 200.0.0.193
    It couldn't be added; it gives the error "The route addition failed: The parameter is incorrect." I tried before connecting to the VPN.
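
    A sketch of the usual fix for this scenario: rather than routing the VPN server, pin a host route for the RDP client's address via the local gateway before dialing the VPN, so return traffic for your session never enters the tunnel. Addresses are the ones from the post:

        :: on comp 2 (200.0.0.20), before connecting to the VPN:
        :: keep traffic back to the RDP client (100.0.0.1) on the local gateway
        route ADD 100.0.0.1 MASK 255.255.255.255 200.0.0.193
        :: the earlier attempt failed because 65.254.61.250 has host bits set
        :: outside the 255.255.255.192 mask; single-host routes use 255.255.255.255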

    Read the article

  • Technologies used in Remote Administration applications (not RD)

    - by Michael
    I want to know what kinds of technologies are used nowadays as the underlying screen-capture engine for remote administration software like:
    - VNC
    - pcAnywhere
    - TeamViewer
    - RAC Remote Administrator
    - etc.
    The programming language is not so important; I just want to know whether a driver that polls video memory 30 times per second needs to be developed, or whether there are COM objects built into the Windows kernel to help with this. I'm not interested in third-party components for doing this. Do I have to use DirectX facilities? I just want a starting point to develop my own screen-stream capture engine that is less of a CPU hog.

    Read the article

  • Brainless Backups

    - by Jesse
    I’m a software developer by trade, which means to my friends and family I’m just a “computer guy”. It’s assumed that I know everything about every facet of computing, from removing spyware to replacing hardware. I can also do all of this blindly over the phone, or after hearing a five-to-ten-word description of the problem over dinner ;-) In my position as CIO of my friends and family, I’ve been in the unfortunate position of trying to recover music, pictures, or documents off of failed hard drives on more than one occasion. It’s not a great situation for anyone, and it’s always at these times that the importance of backups becomes so clear.
    Several months back a friend of mine found himself in this situation. The hard drive on his 8-year-old laptop failed and took a good number of his digital photos with it. I think most folks can deal with losing some of their music and even some of their documents, but it really stings to lose pictures of past events and loved ones. After ordering a new laptop, my friend went out and bought an external hard drive so that he could start keeping a backup of his data. As fate would have it, several months later the drive in his new laptop failed and he learned the hard way that simply buying the external hard drive isn’t enough… you actually have to copy your stuff over every once in a while!
    The importance of backup and recovery plans is (hopefully) well known in IT organizations. Well-executed backup plans are in place, and hopefully the backup and recovery process is tested regularly. When you’re talking about users at home, however, the need for these backups is often understood far too late. Most typical users can’t be expected to remember to back up their data regularly, and they don’t always have the know-how to set up automated backups. For my friends and family members in this situation I recommend tools like Dropbox, Carbonite, and Mozy. Here’s why I like them:
    - They’re affordable: Dropbox and Mozy both have free offerings, though most people with lots of music and/or photos to back up will probably exceed the storage limitations of those free plans pretty quickly. Still, all three offer pretty affordable monthly or yearly plans. In my opinion, Carbonite’s unlimited storage plan for $50-$60 per year is the best value around.
    - They’re easy to set up: Both Dropbox and Carbonite are very easy to get set up and start using. I’ve never used Mozy, but I imagine it’s similarly painless to get up and running.
    - Backups are automatically “off-site”: A backup that is sitting on an external hard drive right next to your computer is great, but it might not protect against flood damage, a power surge, or other disasters in that single location. These services exist “in the cloud”, so to speak, helping mitigate those concerns. Granted, this kind of backup scheme requires some trust in the third party to protect your data from both malicious people and disastrous events. This truly is a bit of a double-edged sword, but I sleep well at night knowing that my data is being backed up and secured by a company made up of engineers that focus on the business of doing backups right.
    - Backups are “brainless”: What I like most about services like these is that they work “automagically” in the background, watching for files to be updated and automatically backing up those changes. There’s no need to remember to plug in that external drive and copy your data over.
Since starting to recommend these services to my friends and family I find myself wearing my “data recovery” hat far less often. The only way backups are effective for your standard computer user is if they’re completely automatic. Backups need to be brainless, or they just won’t work.

    Read the article

  • Win 7 Remote Desktop connection failure when already logged in.

    - by Andy E
    I have a bit of a strange problem, magnified recently by my broadband dropouts. I wasn't sure whether to post this on SU or SF, so I thought I'd start here, as more users would be likely to know what the problem is. In short, when I try to connect to my server (Windows Server 2008) from my laptop running Windows 7, I can only connect if my remote account was previously logged out. If I'm still logged in, I get the error message: "Windows cannot connect to the remote server." No explanation or anything. If my IP address is the same, I don't have this problem. If I boot up Windows XP Mode and run XP's Remote Desktop Connection, it works just fine - I think the difference there is that it takes me to the remote server's logon screen. With Win 7 RDC you never see the logon screen; it asks you for credentials before entering full-screen mode. The real problem is that I'm having random broadband dropouts and my IP isn't static. If I log on via Win XP RDC, log out, and then run Win 7 RDC, it works fine. I realize I can just use Win XP's RDC for now, but I don't really like keeping XP Mode open if I can help it. Does anyone know a way around this problem? Maybe forcing Win 7 RDC to go to the logon screen, or changing some server-side settings to work around the IP address issue?
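
    One thing that may be worth trying before falling back to XP Mode: the Windows 7 client's /admin switch, which targets the console session and sidesteps some session-reconnection quirks. Whether it helps in this exact case is untested - a sketch:

        :: connect to the administrative (console) session
        mstsc /v:your-server /admin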

    Read the article

  • Remmina remote control: black screen after XBMC exit

    - by Tinellus
    I have an HTPC (Quietpc Sidewinder fanless media PC) running Ubuntu 12.04 and autostarting XBMC Frodo. I'm remote-controlling this machine from my laptop, also running Ubuntu 12.04, using Remmina VNC as the client. Everything works perfectly as long as XBMC is running: I can see the remote screen and control it via mouse and keyboard on the laptop. However, when XBMC is stopped on the HTPC, my TV shows the Ubuntu desktop normally, but the screen on the laptop turns black. I'm still controlling the HTPC, though, since I can see the arrow move when I move the laptop's mouse, and I can still type text into the HTPC. Oddly, when the screensaver on the HTPC kicks in, I again have a visual on the laptop. Does anyone have any thoughts on this? What should I do to maintain the visual after stopping XBMC? Any suggestions much appreciated. Thanks!

    Read the article

  • Unable to wake display with remote

    - by Eugene
    I'm running an HTPC (XBMC) without a keyboard/mouse attached, running Oneiric. After some indeterminate amount of time - somewhere between 1 and 12 hours - the display goes to sleep. The computer itself is not sleeping; I can still SSH to it from another computer. The remote will not wake the display. The IR receiver is working, as irw shows me the remote key presses. The only way to get my display back is to restart the display manager, lightdm in this case. Does anybody know a way to keep the display from going to sleep? I don't really need any power management at all, considering that the machine connects to my TV, and when I want the display to go to sleep I turn off the TV.
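
    Since the box is reachable over SSH, the blunt approach is to turn off X's screensaver and DPMS on the HTPC's display; a sketch (the display number is an assumption, though :0 is typical for a single-seat HTPC):

        # run on the HTPC, e.g. over SSH
        export DISPLAY=:0
        xset s off      # disable the X screensaver blanking
        xset -dpms      # disable display power management (standby/suspend/off)
        # to make it stick, run the same two xset lines from a session autostart script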

    Read the article

  • Is it worth it to switch from home-grown remote command interface to using JMX

    - by Sam Goldberg
    Without knowing too much about JMX, I've always assumed that it would be the best approach for building remote management into our standalone Java server application. Our server application has some minimal remote control capability, using text commands sent to it via a TCP/IP socket. Using the home-grown approach, it is fairly easy to add a new command (just create the new command text, and the code to handle it in the message receiver). On the other hand, we have hardly implemented any commands, even though there are many things we would like to be able to execute remotely. I am trying to weigh the value of moving to JMX (learning it, and building the interfaces) versus just sticking with the home-grown approach. Does anyone have any experience or advice regarding changing an existing application to use JMX?
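
    For a sense of scale: the "learning it and building the interfaces" cost starts small, because the JVM's platform MBeans come for free once remote JMX is switched on with the standard system properties. A sketch (port and jar name are placeholders; authentication is disabled here only to keep the illustration short - don't do that in production):

        java -Dcom.sun.management.jmxremote.port=9999 \
             -Dcom.sun.management.jmxremote.authenticate=false \
             -Dcom.sun.management.jmxremote.ssl=false \
             -jar server.jar
        # then attach interactively with the JDK's bundled console:
        jconsole localhost:9999

    Custom commands then become MBean operations instead of hand-parsed socket text.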

    Read the article
