The batch script works when run directly from the command line. But when it is scheduled to run (on a Windows 2003 R2 server) as the local administrator, I get the following error:
D:\ScadaExport\exported>ping 192.168.10.78
Pinging 192.168.10.78 with 32 bytes of data:
Reply from 192.168.10.78: bytes=32 time=11ms TTL=61
Reply from 192.168.10.78: bytes=32 time=15ms TTL=61
Reply from 192.168.10.78: bytes=32 time=29ms TTL=61
Reply from 192.168.10.78: bytes=32 time=10ms TTL=61
Ping statistics for 192.168.10.78:
Packets: Sent = 4, Received = 4, Lost = 0 (0% loss),
Approximate round trip times in milli-seconds:
Minimum = 10ms, Maximum = 29ms, Average = 16ms
D:\ScadaExport\exported>net use Z: \\192.168.10.78\bar-pccommon\scada\
System error 67 has occurred.
The network name cannot be found.
Any ideas? Google is turning up nothing useful; I just keep finding results relating to DNS and the like, but I'm using an IP address here.
I am transferring a bunch (20+) of large (1GB+) files to my external flash drive over USB 2.0. Is it quicker to start them all at once (kick each transfer off without waiting for the previous one to finish), so that multiple transfers are running simultaneously, or to transfer one file, wait for it to finish, and then start the next? The files are coming from a variety of locations, so I can't do one single big transfer.
Are there any other advantages to one way or the other that are worth considering?
I've recently been given the task of migrating about 200GB of data from one dedicated server to another. As this will take a week or more, I've been taking a snapshot of the current files on the FTP server using wget's mirror feature. However, since other users will probably be uploading or changing files in the meantime, the snapshot that I have made will not include the most recent changes.
Since I only have FTP access to this server, I'm planning to write a script that will recursively do an FTP stat on all files in the FTP folder and compare the directory listing against the snapshot I have locally. If the number of files differs, I know files have been added or deleted. If modification dates have changed, I know those files have changed and should be redownloaded specifically.
Am I missing anything in my approach, or are there any possible improvements to this approach?
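The comparison step described above can be sketched with standard tools. This is a hedged, minimal example: it assumes you can dump each recursive listing into a sorted "path&lt;TAB&gt;mtime" file (the file paths and sample data below are placeholders, not anything from the real server):

```shell
#!/bin/bash
# Compare two recursive listings ("path<TAB>mtime" per line, sorted).
old=/tmp/listing_old.txt
new=/tmp/listing_new.txt
printf 'a.txt\t2014-01-01\nb.txt\t2014-01-02\n' | sort > "$old"                     # sample snapshot
printf 'a.txt\t2014-01-01\nb.txt\t2014-01-05\nc.txt\t2014-01-03\n' | sort > "$new"  # sample current

# Lines unique to the new listing are files that were added or modified:
comm -13 "$old" "$new" | cut -f1 > /tmp/to_redownload.txt
# Lines unique to the old listing are files that were deleted (or modified):
comm -23 "$old" "$new" | cut -f1 > /tmp/deleted_or_changed.txt
cat /tmp/to_redownload.txt
```

A modified file shows up in both outputs (old mtime in one, new mtime in the other), so "deleted" really means "in the old list but not in the new one with the same mtime"; intersect the two outputs if you need to separate true deletions from edits.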
When deploying an application with the Tomcat manager I get the following error:
FAIL - Failed to deploy application at context path /prademo
Tomcat log shows:
INFO: HTMLManager: install: Installing context configuration at '/home//webapps/PRA/META-INF/context.xml' from '/home//webapps/PRA' java.io.FileNotFoundException: /home/dstefan/webapps/PRA/META-INF/context.xml (Permission denied)
Permission to what? Both PRA and context.xml have -rwxrwxrwx.
Thanks!
In SAS, how do I add comments to my .LST output file? For example, adding a comment saying "This is the output for tbl_TestMacro:" right before doing a proc print, so that my output file reads:
This is the output for tbl_TestMacro:
Obs field1 field2
1 6 8
2 6 9
3 7 0
4 7 1
Instead of just:
Obs field1 field2
1 6 8
2 6 9
3 7 0
4 7 1
Thanks, Dan
What kinds of executable files can run on Windows XP through Windows 7?
I know of PE, but I don't know if there are any others.
I'm also interested in the different kinds of interpreted executables, like a Java program and such. Thanks.
I'd also like to know what extensions they use; for example, PE uses .exe and .dll.
Hi,
As a SaaS provider handling sensitive information, we are thinking about using an encrypted filesystem (under Linux), but are there any problems with performance, or with maintenance if the filesystem crashes?
We want to use it on a MySQL server for a web application with medium load but high peaks of visitors.
Thanks,
Regards
Cédric
I've checked out Apple's Quick Look Programming Guide: Introduction to Quick Look page in the Mac Dev Center, but as more of a science programmer than an Apple programmer, it's a little over my head (though I could get through it in a weekend if I bash my head against it long enough).
Does anyone know of a good basic Quick Look Generators tutorial that is simple enough for someone with only very modest experience with Xcode?
For those that are curious, I have a filetype called .evt that has an xml header and then binary info after the header. I'm trying to write a generator to display the xml header. There's no application bundle that it belongs to. Thanks!
I'm looking for a tool that will reliably verify the integrity of ALL files on a Windows 7 x64 NTFS disk.
This is for testing experimental defrag software, so it really needs to be secure and foolproof. I know it will take a long time (there are millions of files on the disk), but safety just cannot be compromised in a situation like this. A freeware solution is much preferred.
It can be either Windows software (which introduces pitfalls around files changing as a result of booting Windows) or a standalone boot (for example, a Linux boot CD plus a USB key for storing checksums/metadata).
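For the standalone-boot route, the core of the job is just a recursive checksum pass before and after the defrag. A rough sketch under assumed placeholder paths (on the real disk you would mount the NTFS volume read-only from the live CD and point find at the mount point instead of the demo directory used here):

```shell
#!/bin/bash
# Stand-in files; in reality this directory would be the read-only NTFS mount.
mkdir -p /tmp/ntfs_demo
echo data1 > /tmp/ntfs_demo/a.txt
echo data2 > /tmp/ntfs_demo/b.txt

# Before the defrag run: checksum every file (-print0/-0 handles odd names).
find /tmp/ntfs_demo -type f -print0 | xargs -0 sha256sum > /tmp/before.sha256

# After the defrag run: verify that no file content changed.
sha256sum -c --quiet /tmp/before.sha256 && echo "all files intact"
```

Note this only covers file contents; if the test also needs to catch metadata damage (timestamps, ACLs), you would capture those separately, e.g. with a `find ... -printf` listing.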
I have a new Windows 7 machine named PAP44 in the PAP workgroup. The networking is set to "Work" mode for the wired LAN.
I have a couple of users, and I've shared a folder and set it so both users can read/write. Confusingly for me, rather than sharing just that folder (as I'm used to with older versions of Windows), it appears to be sharing a path (\\pap44\users\...\myFolder).
From another machine on the LAN, running XP, when I go to \\PAP44\Users I'm asked for a username and password, but none of the username+password combinations work. It just jumps back to the username and password dialog, except that the username I entered gets prefixed with PAP44\
My end goal is to get my Debian/Ubuntu machines to be able to access this share, but first of all I thought I'd try to get it working in Windows, after all, that's supposed to be easy!
Is there another step? (PS. I am not a "hit and run" case!)
I have file names like the ones below:
adn_DF9D_20140515_0001.log
adn_DF9D_20140515_0002.log
adn_DF9D_20140515_0003.log
adn_DF9D_20140515_0004.log
adn_DF9D_20140515_0005.log
adn_DF9D_20140515_0006.log
adn_DF9D_20140515_0007.log
I want to get the year, month, and day from the file name and create directories, e.g.:
Ex: [[ ! -d "$BASE_DIR/$year/$month/$day" ]] && mkdir -p "$BASE_DIR/$year/$month/$day";
How can I achieve this? Any ideas or a script would be much appreciated.
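The date can be pulled out of names like these with plain shell parameter expansion, no external tools needed. A minimal sketch, practicing in a throwaway directory with sample file names (BASE_DIR is a placeholder; point it at your real target):

```shell
#!/bin/bash
# Pull YYYYMMDD out of names like adn_DF9D_20140515_0001.log and
# build $BASE_DIR/$year/$month/$day for each one.
BASE_DIR=/tmp/logdirs
cd "$(mktemp -d)"
touch adn_DF9D_20140515_0001.log adn_DF9D_20140516_0002.log   # sample input

for f in adn_*.log; do
    stamp=${f#*_*_}        # strip "adn_DF9D_"  -> "20140515_0001.log"
    stamp=${stamp%%_*}     # strip "_0001.log" -> "20140515"
    year=${stamp:0:4} month=${stamp:4:2} day=${stamp:6:2}
    mkdir -p "$BASE_DIR/$year/$month/$day"
done
ls -R "$BASE_DIR"
```

From there you can add a `mv "$f" "$BASE_DIR/$year/$month/$day/"` inside the loop if the goal is to file the logs away as well as create the directories.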
How can I change the value of, let's say, PasswordAuthentication in /etc/ssh/sshd_config from the command line?
I'd also like to remove the # in front of the key I want to set. These don't all have to be one command. I set up quite a few servers, and remembering where everything is gets exhausting, so I want a series of commands I can copy and paste to do the work for me in future.
Sample values:
PermitRootLogin no
ChallengeResponseAuthentication no
PasswordAuthentication no
UsePAM no
UseDNS no
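A sed substitution per key can handle both jobs at once: change the value and strip a leading # if the line is commented out. A hedged sketch that assumes GNU sed (for -i) and practices on a throwaway copy rather than the real /etc/ssh/sshd_config; on a live server, point FILE at the real config and reload sshd afterwards:

```shell
#!/bin/bash
# Set a key's value in an sshd_config-style file, uncommenting it if needed.
FILE=/tmp/sshd_test.conf
printf '#PasswordAuthentication yes\nUsePAM yes\nUseDNS yes\n' > "$FILE"   # sample content

set_key() {   # usage: set_key <key> <value>
    # Match the key at line start, optionally preceded by "#", and rewrite the line.
    sed -i -E "s/^#?[[:space:]]*$1([[:space:]]+.*)?$/$1 $2/" "$FILE"
}

set_key PasswordAuthentication no
set_key UsePAM no
set_key UseDNS no
cat "$FILE"
```

Because the pattern is anchored at the start of the line, `set_key PasswordAuthentication no` will not accidentally touch ChallengeResponseAuthentication. If the key is absent entirely, this sketch changes nothing; you would append it with a follow-up `grep -q || echo >>` check.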
I'm pretty new to all the VMware world, so this is probably mainly a question about the right set of documentation to look at. I'm trying to clone/copy a VM that I installed on an ESXi installation.
I was trying to follow along with the top example here:
http://serverfault.com/questions/16320/is-there-a-way-to-clone-an-existing-vm-on-an-esxi-server-without-having-to-re-imp
However, I'm using the vSphere client to connect to the ESXi box and manage it, and the vSphere client is telling me it won't let me rename the vmdk file.
The real answer I want is: how do I clone the VM I installed if I want to spin up 5 copies? Is there another utility I can use to copy the vmdk file and then create a new virtual machine from it? Any idea why they nerfed this feature in the vSphere client?
I am running Ubuntu Desktop 12.04 with nginx 1.2.6. PHP is PHP-FPM 5.4.9.
This is the relevant part of my nginx.conf:
http {
    include mime.types;
    default_type application/octet-stream;
    sendfile on;
    root /www;
    keepalive_timeout 65;

    server {
        server_name testapp.com;
        root /www/app/www/;
        index index.php index.html index.htm;

        location ~ \.php$ {
            fastcgi_intercept_errors on;
            fastcgi_pass 127.0.0.1:9000;
            fastcgi_index index.php;
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            include fastcgi_params;
        }
    }

    server {
        listen 80 default_server;
        index index.html index.php;

        location ~ \.php$ {
            fastcgi_intercept_errors on;
            fastcgi_pass 127.0.0.1:9000;
            fastcgi_index index.php;
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            include fastcgi_params;
        }
    }
}
In my hosts file, I redirect 2 domains: testapp.com and test.com to 127.0.0.1.
My web files are all stored in /www.
From the above settings, if I visit test.com/phpinfo.php and test.com/app/www, everything works as expected and I get output from PHP.
However, if I visit testapp.com, I get the dreaded "No input file specified." error.
So, at this point, I pull out the log files and have a look:
2012/12/19 16:00:53 [error] 12183#0: *17 FastCGI sent in stderr: "Unable to open primary script: /www/app/www/index.php (No such file or directory)" while reading response header from upstream, client: 127.0.0.1, server: testapp.com, request: "GET / HTTP/1.1", upstream: "fastcgi://127.0.0.1:9000", host: "testapp.com"
This baffles me because I have checked again and again, and /www/app/www/index.php definitely exists! This is also validated by the fact that test.com/app/www/index.php works, which means the file exists and the permissions are correct.
Why is this happening, and what is the root cause of things breaking for just the testapp.com vhost?
I have a local share with my VMware development server, and I'm finding that when I create new files via OS X, they are created under root instead of jacob.
Which is weird, because when I do the "connect to server" thing I'm explicit that the user is jacob, i.e.
afp://[email protected]/
Suggestions?
I have a 30GB zip file containing an archive of digital materials available in the school library that I want to burn to DVD. Of course, 30GB is far too large for a single DVD, and the content is already zipped. I'm open to ideas, but I'm leaning towards suggestions that will help me automatically spread the file over multiple DVDs, including a simple program to stitch it back together again later.
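If nothing off-the-shelf appeals, the split that ships with any Linux live environment (or with Cygwin/Git Bash on Windows) can do this, and the stitching step is just cat. A small sketch with placeholder names and a deliberately tiny stand-in file instead of the real 30GB archive:

```shell
#!/bin/bash
# Demonstrate split + reassemble on a small stand-in for the 30GB zip.
mkdir -p /tmp/dvdsplit && cd /tmp/dvdsplit
head -c 100000 /dev/urandom > archive.zip        # tiny stand-in file
split -b 30000 archive.zip archive.zip.part_     # use e.g. -b 4480m for single-layer DVDs
cat archive.zip.part_* > rejoined.zip            # the "stitch back together" step
cmp -s archive.zip rejoined.zip && echo "reassembled OK"
```

7-Zip on Windows offers the same idea through its "split to volumes" option, producing .001/.002 pieces it can recombine later; either way, burn one piece per DVD and keep a note of the reassembly command with the discs.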
I've got an unresponsive Rackspace slice that has defied all attempts at access. I created an emergency image from it and deleted the slice, downloading the files that comprise the image to a local source. There are a number of files/assets I would still like to recover from this server if possible, but I'm not sure exactly what I can do with the image files, if anything.
Here are the files I have, for what it's worth:
emergency_########_######_cloudserver########.tar.gz.0 (5gb)
emergency_########_######_cloudserver########.tar.gz.1 (5gb)
emergency_########_######_cloudserver########.tar.gz.2 (5gb)
emergency_########_######_cloudserver########.tar.gz.3 (50mb)
emergency_########_######_cloudserver########.yml (25kb)
Is it possible to mount this image as a drive? Are there other forensic recovery options?
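The .tar.gz.0 through .tar.gz.3 naming suggests one tarball split into sequential chunks, so the first thing worth trying is concatenating them in order and listing the result. A sketch with placeholder names, simulating the split pieces with a tiny tarball rather than the real 15GB of data:

```shell
#!/bin/bash
mkdir -p /tmp/img && cd /tmp/img

# Simulate the split pieces with a tiny tarball:
echo hello > recovered_file.txt
tar czf image.tar.gz recovered_file.txt
split -b 200 -d image.tar.gz emergency.tar.gz.    # emergency.tar.gz.00, .01, ...
rm recovered_file.txt image.tar.gz

cat emergency.tar.gz.* > full.tar.gz   # the shell glob sorts the pieces in order
tar tzf full.tar.gz                    # list contents first, then extract
tar xzf full.tar.gz
cat recovered_file.txt
```

If the reassembled tarball turns out to contain a raw disk image rather than a filesystem tree, you can loop-mount the extracted image on Linux (`mount -o loop image.img /mnt`) and browse it from there.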
Hello,
I searched here and couldn't find a similar issue to mine, but apologies if I missed it. I've searched the web, and no one else seems to be having the same issue either.
I'm running Windows 7 Ultimate 64-bit on a pretty high-spec machine (well, apart from the graphics):
Asus M4A79T Deluxe
AMD Phenom II 965 black edition (quad core, 3.4GHz)
8GB Crucial Ballistix DDR 1333MHz RAM
80GB Intel X25 SSD for OS
500GB mechanical drive for data.
ATi Radeon HD 4600 series PCI-e
Be Quiet! 850W PSU
I think those are all the relevant stats, if you need anything else let me know.
I've updated chipset, graphics and various other drivers all to no avail, the problem remains. I have also unplugged and replugged every connection internally and cleaned the RAM edge connectors.
The problems:
VideoLAN (VLC) and CDBurnerXP both take ages to load: I'm talking 30 seconds and 1 minute respectively, which is really not right.
Copying and pasting from an OpenOffice spreadsheet into Firefox, for example, is really, really slow. I'll have pressed Ctrl+V 5 or 6 times before it actually happens; if I copy and then wait 5 to 10 seconds or so, it'll paste first time, so it's definitely some sort of time lag.
Command and Conquer - Generals: Zero Hour: when playing, it'll run perfectly for about 10 or 20 seconds, then just pause for 3 or 4, then run for another 10 or 20 seconds and pause again, and so on. I had the Task Manager open on my 2nd monitor whilst playing once, and I noticed the game was using about 25% of the CPU, pretty stable. But when the pause came, no other task shot up to 100% like others on the web have been reporting (similar but not the same as my issue, often svchost.exe for them); instead the game dropped to 2 or 3% usage, then went back up to 25% when it started playing properly again. Very odd!
But it gets even odder: I had a BSOD and reboot last week, and when it came back up the problem had completely gone. I could play C&C to my heart's content, both the other apps loaded instantly, and copy and paste worked instantly too. Then I did an AVG update earlier this week which required a reboot, rebooted, and the problem's back. I don't think it's AVG-related, though; I think it was just coincidence that that's the app that required a reboot, and any reboot would have brought the issue back. A number of reboots later, it hasn't gone away again.
If any one could make any suggestions as to the likely cause and solution to these issues I'd be most grateful, it's driving me nuts!
Thanks, Mike....
I tried to migrate a 5-year-old Ruby on Rails application to a new server running Ubuntu 8.04, Apache 2, and MySQL 5.
The application failed to run. When I looked in the error logs, I noticed
Errno::ENOENT (No such file or directory - /var/run/mysqld/mysqld.sock)
I looked around my new server but can't find a mysqld.sock file. How can I fix this problem?
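One common cause is a mismatch between the socket path the app expects and the one the new MySQL build actually creates (or MySQL simply not running, in which case no socket file exists at any path). A hedged sketch of the relevant config/database.yml fragment; the paths and database name below are examples, so check the socket setting in /etc/mysql/my.cnf or the output of `mysqladmin variables` for the real one:

```yaml
# config/database.yml -- socket path and database name are illustrative only
production:
  adapter: mysql
  database: myapp_production
  socket: /var/run/mysqld/mysqld.sock
```

If the paths already match, verify the server is up (`service mysql status`) before digging further; Errno::ENOENT on the socket almost always means "nothing is listening there".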
I am using Picasa Photo Viewer and have associated all JPG and PNG files to open with it. However when I open an image from the Recent Items list in the Windows 7 Start Menu it opens with Windows Photo Viewer.
The context menu for such items reveals no action that would send it to Windows Photo Viewer, and the default action (the one in bold) opens with the Picasa viewer as you'd expect. It's just that a plain left click behaves differently for some reason.
Any ideas on how to fix this?
I realize this has been asked before and I have read as much as I could find on the topic but I still need help with this because there are so many different approaches and the ones I am trying aren't working.
So I have 2 routers; let's call them A and B. Both have a wireless feature and are active. A is in the basement and receives the internet. There is a TV on the ground floor that is connected to A through an ethernet wire. B is upstairs and gets the internet from A through an ethernet wire. Connected to B is a desktop running Plex Media Server.
What I want to do is make sure devices connected to both routers can access the Plex Media Server.
What I have read is that I should plug the ethernet wire connecting B to A into a LAN port instead of the WAN port, and after that I should turn off DHCP. I have tried this, and B stops receiving internet. What am I doing wrong?
Another thing I have read is to use Router B in bridge mode but Router B is running openwrt and I have QoS on it so gaming/VoIP/browsing is unaffected by heavy downloading/uploading. I would prefer to keep this active. I realize it might be ineffective if a device in Router A is doing some hardcore downloading but all that stuff is done on Router B anyway so it doesn't matter. Router A can't get openwrt because it is a shitty one provided by Bell.
So, how do I proceed with this?
My question is pretty simple and is actually stated in the title. One of my applications throws "too many open files" errors at me, even though the limit for the user the application runs as is higher than the default of 1024 (lsof -u $USER reports 3000 open fds).
Because I cannot imagine why this happens, I guess there might be a maximum per process.
Any ideas are very much appreciated!
Edit: Some values that might help...
root@Debian-60-squeeze-64-minimal ~ # ulimit -n
100000
root@Debian-60-squeeze-64-minimal ~ # tail -n 4 /etc/security/limits.conf
myapp soft nofile 100000
myapp hard nofile 1000000
root soft nofile 100000
root hard nofile 1000000
root@Debian-60-squeeze-64-minimal ~ # lsof -n -u myapp | wc -l
2708
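The nofile limit is indeed enforced per process on Linux, and a daemon often does not inherit limits.conf values at all: those are applied by PAM at login, so a process started by init or a script outside a login session can still be stuck at 1024. Rather than trusting limits.conf, you can ask the kernel what is actually in force for a given process. A small sketch, using the current shell as the example process:

```shell
#!/bin/bash
# What limit is really in force for this process (limits.conf may not apply):
grep 'open files' /proc/self/limits

# How many fds this process actually holds right now:
ls /proc/self/fd | wc -l
```

For the real application, substitute its PID: `/proc/<pid>/limits` and `/proc/<pid>/fd`. If the limit there is still 1024 despite your limits.conf entries, raise it at launch time instead, e.g. with a `ulimit -n 100000` in the app's init script before it starts.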
My friend has a Sony Ericsson Xperia (most probably an X8) handset, which is an Android-based smartphone. He is not able to send files over Bluetooth, but he is able to receive them. I don't remember exactly which version of Android he currently has. He has tried to download and install the required update on the phone itself (he has internet connected) from http://www.sonyericsson.com, but was not able to install it. So he asked me to help him. Which version of Android do you think he has? Does it seem like he does not have Android 2.1 installed? It is written here (click on Xperia X8) that
If your phone already has Android 2.1, you can use your mobile network* or a WiFi connection to download the software.
Is it possible that he does not have up-to-date Android and so is not able to download it? If so, does he need to upgrade to Android 2.1 first? Should that be done by connecting the phone to a PC?