Previously, I could copy the Windows XP CD contents to D:\windowsxp, then:
Boot DOS CD
format C: /s /q
D:\windowsxp\i386\winnt.exe
How can I install Windows 7 from MS-DOS? All the guides I have found only cover installing from USB.
I have an Ubuntu 10.04.4 LTS server with the r1soft agent installed on it. Recently, backups have been failing with the following error.
--------
write error while sending code: Broken pipe
--------
I have reinstalled the buagent, but to no avail. Checking the server logs, I see the following errors:
--------
# tail -f /var/log/messages |grep -i buagent
Nov 17 03:35:06 microscope buagent: Need to back up 126 sectors
Nov 17 03:35:06 microscope buagent: (Righteous Backup Linux Agent) 1.79.0 build 12433
Nov 17 03:35:06 microscope buagent: allowing control from backup server (10.128.136.195) with valid RSA key
Nov 17 03:35:06 microscope buagent: allowing control from backup server (10.128.136.201) with valid RSA key
Nov 17 03:35:06 microscope buagent: sending auth challenge for allowed host at (10.128.136.201) port (47890)
Nov 17 03:35:06 microscope buagent: host (10.128.136.201) port (47890) authentication successful
Nov 17 03:35:06 microscope buagent: Backup request accepted. Starting backup.
Nov 17 03:35:06 microscope buagent: Snapshot completed in 0.010 seconds.
Nov 17 03:45:03 microscope buagent: Error reading blocks from snapshot.
Nov 17 03:45:03 microscope buagent: Reading blocks failed
Nov 17 03:45:03 microscope buagent: error backup aborted
Nov 17 03:45:03 microscope buagent: backup failed on agent closing connection
Nov 17 03:45:03 microscope buagent: Backup failed.
Nov 17 03:45:03 microscope buagent: write error while sending code: Broken pipe (32)
Nov 17 03:45:03 microscope buagent: tell child write failed
--------
I tried changing the 'Timeout' and 'DiskAsPartition' values in the '/etc/buagent/agent_config' file, but no luck. I have also verified that the proper route to the backup server is in place, and the agent itself is running fine.
Am I missing anything? Any help would be much appreciated.
Note: CDP 2.0 is installed on the backup server.
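Given the "Error reading blocks from snapshot" lines, a few hedged diagnostics may narrow this down: that error usually points at an I/O problem on the disk underneath the snapshot rather than at the network. The kernel module and init script names below are assumptions based on typical r1soft installs and may differ by agent version.

```shell
# Look for disk read errors around the failure time
dmesg | grep -iE 'i/o error|sector'

# Confirm the r1soft snapshot kernel module is loaded (name assumed)
lsmod | grep -i hcp

# Confirm the agent service state
/etc/init.d/buagent status
```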
I'm using Google Apps for my domain's e-mail via IMAP. Whenever I send mail to a mailing list, I don't receive a copy of my own mail back in my inbox. According to Google, this is a "feature."
Is there a way to disable this "feature" so that all mail I send to mailing lists appears in my inbox just like all other e-mail?
Perhaps something along the lines of this method for disabling Google's spam filter?
I want to do regular, automatic backups of my VMware virtual machine (16 GB, Windows XP) while it is running.
I do not have access to the ESX admin console. I can ask our admin to set something up in the admin area, but I do not have access myself.
I have installed a few programs that are important to me, so I want to have a working backup at any point in time.
Note:
I know I can copy all the files when the virtual machine is not up and running.
I work for a small state college. We currently have 4 ESXi hosts (all Dell), 2 EqualLogic SANs (a PS4000 and a PS4100), and a bunch of old HP ProCurve switches. The current setup is far from being redundant or fast, so we want to improve it. I have read several threads but only got more confused.
The ProCurve switches are 2824s. I know they don't support jumbo frames and flow control at the same time, but we have plans to upgrade to something like the ProCurve 3500yl. Any suggestions? I've heard the Dell PowerConnect 6xxx series is pretty good, but I'm not sure how it compares to the HPs.
There will be a 4-port Etherchannel (Link Aggregation) between the switches, and all control modules on SAN will be connected to different switches.
Is there anything that would make the setup better? Are there better switches than the ProCurve 3500yl that cost less than $5k? What kind of bandwidth can I expect between the ESXi hosts (they will also be connected to the 2824s with multiple cables) and the SANs?
Looking at the ASUS web page, as well as others, the Pentium D is not explicitly listed as supported on the ASUS P5Q Pro Turbo motherboard.
http://ca.asus.com/product.aspx?P_ID=c19zNYHCAXhCqBPq&templete=2
However, many sites seem to indicate that it is.
Is it supported after a BIOS update, or are these sites simply copy/pasting laundry lists of LGA775 CPUs?
Background
I have VirtualBox on Linux, and recently the drive with the system files failed on me. I'm able to use a Live CD to view the files on the storage drive, and through this I can copy my data files to another computer while I work on this one.
How do I load the .vdi on another computer (in this case, a fresh install of Ubuntu)? I see many examples online of how to export it, but they all assume the host is still working.
Thanks!
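Since the original host is dead, exporting isn't actually required: an existing .vdi can be attached directly to a newly created VM. A minimal sketch, assuming VirtualBox is installed on the fresh Ubuntu and the image has been copied to ~/recovered/winxp.vdi (path and VM name are illustrative):

```shell
# Register a new VM shell for the recovered disk
VBoxManage createvm --name "recovered-xp" --ostype WindowsXP --register

# Add an IDE controller and attach the existing .vdi to it
VBoxManage storagectl "recovered-xp" --name "IDE" --add ide
VBoxManage storageattach "recovered-xp" --storagectl "IDE" \
    --port 0 --device 0 --type hdd --medium ~/recovered/winxp.vdi
```

After that the VM should appear in the VirtualBox GUI and boot from the old disk.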
Do you know of any small, standalone, free tool that can be run from the console to back up / restore ADAM / AD LDS database files (such as adamntds.dit, edbres00001.jrs, etc.)?
I tried stopping the ADAM service and copying these files to another location, but afterwards I was unable to restore ADAM from them.
I know that on Windows Server 2003 I could use a backup tool provided by Microsoft, but it seems to be unavailable on Windows Server 2008.
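One console-based option on Windows Server 2008 is dsdbutil, which ships with the AD LDS role and can create an "install from media" (IFM) snapshot, i.e. a consistent copy of the instance database that can be restored later. A sketch, with the instance name and target path as placeholders:

```shell
REM Take an IFM snapshot of an AD LDS instance (instance name and
REM output path are illustrative -- substitute your own).
dsdbutil "activate instance MyAdamInstance" ifm "create full C:\adlds-backup" quit quit
```

A plain file copy of a stopped instance can fail to restore cleanly because of uncommitted transaction logs, which is what the IFM snapshot avoids.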
Hello everyone,
I am using the scp command to copy a file from a MacBook Pro (OS X 10.5) to a Linux box (Red Hat Enterprise Linux 5).
I am running the following command on the Mac: sudo scp ~/.ssh/mykey.rsa [email protected]. There is no output on the Mac command line, so I am not sure whether the scp succeeded or not. Where does the file mykey.rsa end up on the remote computer 10.10.100.101?
thanks in advance,
George
I just read the Linux scp command issue question, and it reminded me that I regularly forget to specify the colon in the host part of an scp command, thus copying the file locally instead of to a remote host. E.g. I run
scp foo host
instead of
scp foo host:
But I never use scp to copy a file locally, so I wonder if there is a way to make scp fail when both the source and the destination arguments refer to local files.
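One way to get this behaviour is a shell wrapper that refuses to run scp unless at least one non-option argument contains a colon, i.e. names a remote host. A minimal sketch for ~/.bashrc (the option handling is deliberately naive, e.g. it does not special-case -P's port argument):

```shell
# Wrapper: refuse to run scp if every argument is local (no "host:").
# "command scp" falls through to the real binary when a remote
# argument is present.
scp() {
    local arg remote=0
    for arg in "$@"; do
        case $arg in
            -*)  ;;              # skip options such as -r or -P
            *:*) remote=1 ;;     # looks like a host:path argument
        esac
    done
    if [ "$remote" -eq 0 ]; then
        echo "scp: refusing: neither argument names a remote host" >&2
        return 1
    fi
    command scp "$@"
}
```

With this in place, `scp foo host` fails immediately with an error, while `scp foo host:` runs as usual.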
I have several shares residing on a Samba server in a small business environment that I would like to provide search facilities for. Ideally this would be something like Google Desktop with some extra features (see below), but failing that, the idea is to take what I can get, or at least get an idea of what is out there.
Using Google Desktop Search as a reference model, the principal additional requirement is that it be usable from clients over the network. In addition there are some other notes (none of these are hard requirements):
The content is always files, residing on a single server, accessible from samba shares.
Standard MS Office document fare.
Also a lot of RARs and ZIPs, which it is necessary to search inside.
Permissions support, allowing for user-based control to reflect current permission access in samba shares.
The userbase will remain fairly static, so manual management of users is fine.
The majority of users will be Windows-based.
I know there are plenty of search indexers out there: Beagle and Tracker seem to be the most popular. Most do not seem to offer access control, and web-based/remote search does not seem to be a high priority. I've also seen a recent post on the Samba mailing list asking for pretty much the same thing. (They mention a product called IBM OmniFind Yahoo! Edition, and while the initial reception seems positive, I am pretty skeptical. RHEL 4? Firefox 2? Updated much?)
What else is out there? Are you in a similar situation? What do you use?
I got a certificate from my network administrator, along with its passphrase. I put everything in the Tunnelblick configuration folder, but I always get an error:
2010-11-20 13:22:10 Cannot load private key file vpn-pass.key: error:06065064:digital envelope routines:EVP_DecryptFinal:bad decrypt: error:0906A065:PEM routines:PEM_do_header:bad decrypt: error:140B0009:SSL routines:SSL_CTX_use_PrivateKey_file:PEM lib
Everything was copied and pasted, and it works on a Windows machine. How can I get this to work?
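One way to narrow this down is to test the key and passphrase with openssl directly, outside Tunnelblick. The sketch below uses a throwaway key to demonstrate the check; substitute vpn-pass.key and your real passphrase to test the actual file:

```shell
# Generate a throwaway passphrase-protected key (stands in for vpn-pass.key)
openssl genrsa -aes128 -passout pass:secret -out test.key 2048 2>/dev/null

# Correct passphrase: the key parses and openssl exits 0
openssl rsa -in test.key -passin pass:secret -noout && echo "passphrase OK"

# Wrong passphrase: openssl fails with the same "bad decrypt" family
# of errors that Tunnelblick is reporting
openssl rsa -in test.key -passin pass:wrong -noout 2>/dev/null || echo "decrypt failed"
```

If the real key fails with the correct passphrase, the file was likely corrupted in transfer (e.g. line-ending or encoding changes when copying from the Windows machine).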
I have a website that works perfectly with Chrome and other browsers, but I get some errors from PHP in CLI mode, so I'm investigating by running:
openssl s_client -showcerts -verify 32 -connect dev.carlipa-online.com:443
Quite surprisingly, my HTTPS connection appears untrusted, with Verify return code: 27 (certificate not trusted). Here is the raw output:
verify depth is 32
CONNECTED(00000003)
depth=2 C = US, O = GeoTrust Inc., CN = GeoTrust Global CA
verify error:num=20:unable to get local issuer certificate
verify return:1
depth=2 C = US, O = GeoTrust Inc., CN = GeoTrust Global CA
verify error:num=27:certificate not trusted
verify return:1
depth=1 C = US, O = "GeoTrust, Inc.", CN = RapidSSL CA
verify return:1
depth=0 serialNumber = khKDXfnS0WtB8DgV0CAdsmWrXl-Ia9wZ, C = FR, O = *.carlipa-online.com, OU = GT44535187, OU = See www.rapidssl.com/resources/cps (c)12, OU = Domain Control Validated - RapidSSL(R), CN = *.carlipa-online.com
verify return:1
So GeoTrust Global CA appears not to be trusted on the system (Ubuntu 11.10). I added Equifax_Secure_CA to try to solve this, but then I get Verify return code: 19 (self signed certificate in certificate chain)!
Raw output :
verify depth is 32
CONNECTED(00000003)
depth=3 C = US, O = Equifax, OU = Equifax Secure Certificate Authority
verify error:num=19:self signed certificate in certificate chain
verify return:1
depth=3 C = US, O = Equifax, OU = Equifax Secure Certificate Authority
verify return:1
depth=2 C = US, O = GeoTrust Inc., CN = GeoTrust Global CA
verify return:1
depth=1 C = US, O = "GeoTrust, Inc.", CN = RapidSSL CA
verify return:1
depth=0 serialNumber = khKDXfnS0WtB8DgV0CAdsmWrXl-Ia9wZ, C = FR, O = *.carlipa-online.com, OU = GT44535187, OU = See www.rapidssl.com/resources/cps (c)12, OU = Domain Control Validated - RapidSSL(R), CN = *.carlipa-online.com
verify return:1
Edit
It looks like my server does not trust/provide the Equifax root CA; however, I do have the file in /usr/share/ca-certificates/mozilla/Equifax...
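One hedged check: point s_client at Ubuntu's consolidated CA bundle explicitly (the path below is the standard one on Ubuntu; adjust if yours differs). If verification then passes, the root CAs are fine and the likely culprit is the web server not sending its intermediate certificate chain, which browsers tolerate (they cache intermediates) but a bare openssl client does not:

```shell
# Retry verification against the system-wide CA bundle
openssl s_client -CAfile /etc/ssl/certs/ca-certificates.crt \
        -connect dev.carlipa-online.com:443 -showcerts < /dev/null
```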
Here's a quick summary of the environment I support: we have a domain (domain A) with about 20 client computers. The domain server for this domain, and all its clients, sit within the network infrastructure of a larger domain (domain B). All the computers get their network settings via DHCP from domain B's servers. I have no control over, and am unable to make changes to, anything to do with domain B.
The problem I have is that currently, in order for domain A's clients to resolve the domain server and the shares on it, they have their DNS server IP address set to domain A's domain server (via the default GPO). Unfortunately, when a laptop (Windows or Mac) gets taken home, it still looks to the domain server as its DNS server and obviously can't access the internet correctly outside our environment. Ideally, I need a solution where the machines use domain A's domain server as their DNS server while inside the office, and whatever DNS server DHCP gives them when they are outside. However, since I have no control over the office DHCP server, I'm not sure how this can be accomplished.
Any help and advice that anyone can offer is highly appreciated.
Thanks,
Harry
P.S. The solution I'm trying to find needs to require no involvement from the user.
I have two Windows 2008 Standard servers running DFSR without problems: I can create a file on one server and it is replicated to the other, etc. The namespace shared folder on each server is shared with Full Control for Administrators and Change/Read permissions for Everyone.
I then browse to the folder on server 1, e.g. \\server1\namespace\share\folder1.
I right-click the folder and configure the NTFS permissions as I would like, for example: Administrators Full Control, one user with Read/Write access, and no other users in the list.
I save this and then double-check the second server, e.g. \\server2\namespace\share\folder1.
I right-click the same folder name as before and can see that the NTFS permissions have replicated accordingly.
I right-click the folder and go to Properties > Security > Advanced > Effective Permissions, and select a user that shouldn't be able to get into that folder, e.g. testuser. The result agrees with the NTFS permissions and shows that testuser has no ticks next to any permissions, so he should be denied access.
I log on to a network PC (or the server) as testuser and browse to \\server1\namespace\share\folder1. It lets me straight in, with no access-denied message. The same applies to server2.
It seems as though all my NTFS permissions are being ignored. I have one DFS share, and the subfolders are a mixture of private and public folders, so I really need the NTFS permissions to work. Any idea what's going on? Is this normal? From my tests, any user can access any DFSR folder under namespace\share, which is quite worrying.
Thanks
Some PDF files produce garbage ("mojibake") when you copy text. This makes it impossible to search them (whatever you search for will not match the garbage). Does anyone have an easy workaround?
An example: TEAC TV manual EU2816STF
BTW: I am using Adobe Reader - perhaps an alternative viewer might help?
I used to work with two laptops (Vista and Win7), with my work files on an external USB disk.
My oldest laptop broke down, so I bought a new one.
I had no option other than to take Win8.
1/ I suspect something changed with access rights, as my external disk started giving "access denied" errors on Win8.
I was prompted by Win8 to fix the access rights, which I tried to do by going to Properties > Security. This process was very slow and ended up saying "disk is not ready".
Additionally, the USB disk was somehow no longer recognized.
2/ Back on Win7, I was warned that my disk needed to be checked, which I did.
In this process, some files were lost (most of them I could recover from the found00x folder, but I have backups anyway).
Also, I don't know why, but under Win7 all the folders showed with a lock icon.
3/ Then back to Win8 again.
Same problem: access denied on my disk, and no way to change the access rights, as it gets stuck at "disk is not ready".
Now I am pretty sure there is some kind of bug or inconsistency between Win8 and Win7.
I did 2/ and 3/ a few times.
At some point, I also got an access denied error in Win7.
I could restore access rights on the disk to "SYSTEM" (Properties > Security > Edit, granting Full Control to the "SYSTEM" group ...).
But then I still get the same access-rights problem on Win8, and get stuck in the process of restoring Full Control to the "SYSTEM" and "Administrators" groups.
Now, after trying for more than 3 days, I am losing my patience with Win8, which I did not want to buy but had no choice.
I have applied all available Windows updates to Win8; that does not help.
Can anybody help me?
I'm the lead developer in a team of two. My partner has only just joined the project, and despite using Git for version control, we are still stuck in the dark ages when it comes to code deployment.
Currently I make all site updates via FTP using FileZilla (this way I have control of, and responsibility for, everything that goes live).
I've done this for years, but we now have some large PHP classes (300 KB) and a lot of traffic.
So, in short, every time I upload a key class ("general", for example), the site goes down until the file finishes uploading. This is only 5-6 seconds at a time, but that is increasingly unacceptable.
I realise I can upload the file under a different name and then rename both files... but surely there must be a better way?
I've heard about rsyncing the code across from another server, but I don't see how that prevents requests from hitting the file while it is being replaced.
We only have one server (for DB and Apache) but also use some cloud servers (for openx as an example).
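The upload-under-a-different-name idea from the question can be made safe, because renaming within one filesystem is a single atomic operation. A local demonstration of the principle (file names illustrative):

```shell
# Simulate the currently deployed file
printf 'old version\n' > general.php

# The slow upload writes to a temporary name; the live file is untouched
printf 'new version\n' > general.php.tmp

# mv on the same filesystem is an atomic rename(): any concurrent
# request sees either the old file or the new one, never a partial one
mv general.php.tmp general.php

cat general.php    # -> new version
```

rsync-based deploys typically rely on the same trick: rsync writes each file to a temporary name and renames it into place when the transfer completes.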
There are times when, after I eject a USB flash drive, I want to copy some more files over to it. In that case, do I always need to unplug the drive and plug it back in, or is there a way to "reconnect" or "un-eject" the drive?
To eject, there are two ways:
1) Right-click the drive (say H:) and choose Eject
2) Click "Safely remove hardware" on the notification-area icon
But there seems to be no way to un-eject or reconnect a drive.
I'm working with a client who wants me to implement a particular design in an IIS/ASP.NET environment. This design has already been implemented in Java, but I am not sure it is possible using Microsoft technologies.
In a Tomcat/Java environment one can create so-called handler chains. In essence, a handler runs on the server hosting the web service and intercepts the SOAP messages coming to the web service. The handler can perform a number of tasks before passing control to the web service; some of these tasks may involve authentication and authorization. Moreover, one can create handler chains, so that the handlers run in a particular sequence before control passes to the web service.
This is a very elegant solution, as certain aspects of authentication and authorization can be performed automatically, without the developers of the client application or of the web service having to invest anything in it. The code of the client application and of the web service is not affected.
You can find a number of articles on this subject by searching Google for "web service handler chain".
I have searched for web service handlers in IIS and ASP.NET. I get some hits, but apparently "handlers" in IIS mean something different from what is described above.
My question therefore is: can handler chains (as available in Java and Tomcat) be created in IIS? If so, how (any article, book, forum...)?
Either a negative or a positive answer will be greatly appreciated.
Mike
Here is the current situation:
My cousin deleted Windows from his hard drive (yeah, don't ask...). The drive still has about 200 GB of files on it that he may want to recover before we format it and reinstall Windows 7. Is there a way I can create a bootable CD from some utility that will let me access the files on the hard drive and copy them to a flash drive? What's the best utility for that?
I have a remotely hosted (virtual, VMware) dedicated server (Windows Server 2008 Web Edition with SP1) that I can only connect to over Remote Desktop. Lately, a process hogs the CPU for about 40 minutes almost every day (at a random hour) and brings down all the web sites on the server. While this is going on, I also cannot connect via Remote Desktop to investigate which process it is. Promptly after the 40 minutes I can RD in, and the first thing I see in Performance Monitor is that something was pegging the CPU at 100% and stopped just before I was able to connect. I know when these episodes begin and end because I have monitors set up that email me the up/down status of the web sites, but I'm locked out while it is happening and can't RD to the server until it's over (too late to see the Task Manager / Process Explorer picture).
What is the best way/tool to continuously monitor all processes on the server, so that when this happens I can log in afterwards and "replay" it to find the process causing the trouble?
(I have no control over the virtual/VMware setup, as it is hosted by a third party, but I have full control over my dedicated machine.)
Thanks in advance!
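One built-in option for this is a performance counter log created with logman, which records per-process CPU continuously so the spike can be replayed in Performance Monitor afterwards. A sketch (collector name, sample interval, and output path are illustrative):

```shell
REM Create a counter log sampling every process's CPU every 15 seconds
logman create counter ProcCPU -c "\Process(*)\% Processor Time" -si 15 -o C:\PerfLogs\ProcCPU

REM Start collecting; the resulting log can be opened in perfmon later
logman start ProcCPU
```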
I want to copy TCP traffic. I want to use these commands,
iptables -A PREROUTING -t mangle -p tcp --dport 7 -j ROUTE --gw 1.2.3.4 --tee
iptables -A POSTROUTING -t mangle -p tcp --sport 7 -j ROUTE --gw 1.2.3.4 --tee
like stated here
http://stackoverflow.com/questions/7247668/duplicate-tcp-traffic-with-a-proxy
but iptables keeps telling me "iptables v1.4.8: unknown option '--gw'"
What can I do to fix this?
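The --gw error suggests this iptables build lacks the old patch-o-matic ROUTE target. On reasonably recent kernels the in-tree replacement for traffic duplication is the TEE target (xt_TEE), whose option is --gateway rather than --gw. A hedged sketch (requires root and kernel support for xt_TEE):

```shell
# Duplicate matching packets to 1.2.3.4 using the in-tree TEE target
iptables -t mangle -A PREROUTING  -p tcp --dport 7 -j TEE --gateway 1.2.3.4
iptables -t mangle -A POSTROUTING -p tcp --sport 7 -j TEE --gateway 1.2.3.4
```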
With Kind Regards
Hi,
What do I need to do to deploy a Titanium app to a jailbroken iPhone?
I can deploy an Xcode app to a jailbroken iPhone by creating my own certificate.
I tried packaging the KitchenSink app, but it came up with a packaging error.
I assume that if I can get past the packaging stage, I can simply copy the app file onto my iPhone (following the previous iPhone guide).
Thanks
I'm trying to get a PPTP server running on an Ubuntu server, but I've run into some issues. I followed this guide on how to set up pptpd on my server, and everything went smoothly, but when I try to connect from my Mac, it gives me this error:
Here's my configuration:
Does anyone have any idea as to what I'm doing wrong here?
Update: Here's what the pptpd.log has to say about it:
steve@debian:~$ sudo tail /var/log/pptpd.log
sudo: unable to resolve host debian
Sep 3 21:46:43 debian pptpd[2485]: MGR: Manager process started
Sep 3 21:46:43 debian pptpd[2485]: MGR: Maximum of 11 connections available
Sep 3 21:46:43 debian pptpd[2485]: MGR: Couldn't create host socket
Sep 3 21:46:43 debian pptpd[2485]: createHostSocket: Address already in use
Sep 3 21:46:56 debian pptpd[2486]: CTRL: Client 192.168.1.101 control connection started
Sep 3 21:46:56 debian pptpd[2486]: CTRL: Starting call (launching pppd, opening GRE)
Sep 3 21:46:56 debian pptpd[2486]: GRE: read(fd=6,buffer=204d0,len=8196) from PTY failed: status = -1 error = Input/output error, usually caused by unexpected termination of pppd, check option syntax and pppd logs
Sep 3 21:46:56 debian pptpd[2486]: CTRL: PTY read or GRE write failed (pty,gre)=(6,7)
Sep 3 21:46:56 debian pptpd[2486]: CTRL: Reaping child PPP[2487]
Sep 3 21:46:56 debian pptpd[2486]: CTRL: Client 192.168.1.101 control connection finished
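The "createHostSocket: Address already in use" line in the log suggests something is already bound to the PPTP control port (1723), for example a leftover pptpd instance that was never stopped. A quick check (tool availability varies by distro):

```shell
# Show which process, if any, is already listening on TCP port 1723
netstat -tlnp | grep ':1723'
```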
My pptpd options are:
asyncmap 0
noauth
crtscts
lock
hide-password
modem
debug
proxyarp
lcp-echo-interval 30
lcp-echo-failure 4
nopix