Search Results

Search found 11618 results on 465 pages for 'shared storage'.

  • Free software for backing up an attached network drive

    - by Richard
    My wireless router has a USB connector that lets me plug in an external hard drive, which it then serves as Network Attached Storage. The problem is that I want to back this drive up to an external drive attached to another computer, so that if the NAS drive fails I don't lose everything. However, Windows 7 Backup refuses to include the NAS as a backup source, and I can't fool it by mapping the share to a drive letter either. Google presents lots of pages on how to back up files to a NAS, but not the other way around.

    Can anyone recommend free software that can do incremental backups of a NAS drive to an external drive attached to the computer it runs on? I'm aware of this question, but the top answers have one or more of the following issues:

    - They aren't free, or the free version cannot back up a NAS.
    - They cannot do incremental backups.
    - They're just a script, and therefore lack other functionality (disk space management, scheduling, compression, etc.).
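
    For reference, the core incremental pass being asked for is a one-liner with robocopy, which ships with Windows 7. This is a hedged sketch only - the share and folder names are hypothetical, and it deliberately fails the asker's "not just a script" requirement:

        rem Copies only new/changed files from the NAS share to a local folder
        rem (robocopy skips unchanged files by default; add /MIR to also mirror
        rem deletions). /FFT tolerates the NAS's coarser FAT-style timestamps.
        robocopy \\NAS\share D:\NAS-Backup /E /FFT /R:2 /W:5 /LOG+:D:\nas-backup.log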

  • Windows 7 BSOD when changing power plan

    - by dd5
    I have a strange problem: when I try to change the power plan on my laptop from High performance to Balanced, Windows freezes and I get a BSOD. The power plan settings are all default.

    Laptop specs:

    - Intel Core i3 330M/350M
    - Intel HM55 Express chipset
    - 8 GB DDR3 1066 MHz SDRAM
    - ATI Mobility Radeon HD 5730 with 1 GB DDR3 VRAM
    - Intel SSD 330, 128 GB
    - Windows 7 Home Premium

    I've searched the internet but couldn't find a similar issue. The BSODs first started when I installed this SSD, stopped when I updated the chipset controller driver, then started again yesterday when I tried to change the power plan. Minidump file here. Any help with this weird issue is appreciated, thanks.

    Edit: I've run the Memory Diagnostic tool and the Intel SSD diagnostics, and updated the firmware to 3.2.1. None of these steps helped or showed any errors; I still get a BSOD when changing the power plan. After analyzing the dump file via osronline.com, here are the first few lines:

        CRITICAL_OBJECT_TERMINATION (f4)
        A process or thread crucial to system operation has unexpectedly exited or
        been terminated. Several processes and threads are necessary for the
        operation of the system; when they are terminated (for any reason), the
        system can no longer function.
        Arguments:
        Arg1: 0000000000000003, Process
        Arg2: fffffa8008661b30, Terminating object
        Arg3: fffffa8008661e10, Process image file name
        Arg4: fffff800033de270, Explanatory message (ascii)

    -- Solution -- Provided by Vinayak: after installing the Intel Rapid Storage Technology driver from MajorGeeks, I haven't experienced a BSOD since. Thank you :)

  • Port 22 is not responding

    - by Emanuele Feliziani
    I'm trying to make the jump from shared hosting to a VPS for better performance and greater flexibility, but I'm stuck: I can't access the machine via SSH.

    First of all, the machine is a CentOS 6.3 cPanel x64 with WHM 11.38.0. sshd is running (it appears in the list of running processes), but a port scan shows that port 22 is not responding. Port 21 is responding, yet I can't access the machine via FTP either (I think it's a security measure, but I don't know where to disable/enable it). So I'm stuck in WHM with no way to reach the machine's configuration, neither via SSH nor via FTP/SFTP. When I try to connect with SSH from Terminal I only get this:

        ssh: connect to host xx.xx.xxx.xxx port 22: Operation timed out

    I get the same result using the hostname instead of the IP address. There seems to be no firewall in WHM, and I have whitelisted my home IP address for SSH access, though there were no restrictions in the first place. I've been wandering through all the settings and options in WHM for several hours now and can't find anything. Does anybody have a clue as to where I should start investigating?

    Update: Thanks everyone. It was in fact a firewall, one not controlled by WHM. I managed to get into the console from the VPS control panel (a terrible, terrible Java app that barely took my keyboard input) and disabled the firewall altogether by running service iptables stop, after which I could access the console via SSH from the terminal. Now I have to set up the firewall again, because that command appears to have completely wiped the iptables rules. Can you recommend any newbie-friendly resource where I can learn how to go about this and what I should block? Or should I just go with something like http://configserver.com/cp/csf.html? Thanks again to everyone who helped me out.
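
    For the follow-up question, a minimal iptables baseline to restore after the wipe might look like the sketch below. The port list is an assumption - a cPanel box typically needs its own service ports too, and CSF (linked above) will generate a fuller ruleset:

        # Default-deny inbound, then allow loopback, return traffic, and key services.
        iptables -P INPUT DROP
        iptables -A INPUT -i lo -j ACCEPT
        iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
        iptables -A INPUT -p tcp --dport 22 -j ACCEPT                       # SSH
        iptables -A INPUT -p tcp -m multiport --dports 21,80,443 -j ACCEPT  # FTP, web
        service iptables save   # persist the rules across reboots on CentOS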

  • Hosting a server for websites, ftp and random use at home?

    - by Zolomon
    I'm wondering what the best option is if I want to move all my websites from a hosting company to a server at my own home. Basically, my needs are:

    - host websites using PHP and/or ASP.NET (haven't decided yet; both would be preferred)
    - FTP, so I can create accounts for family members to access the server for file handling
    - SSH
    - SSL for secure connections (certificates are bought/applied for per domain; I'm not sure what server-side settings are needed)
    - video streaming
    - remote desktop
    - hosting home-brew applications that run as services
    - MySQL/SQLite/SQL Server for relational database storage

    What should I think about before I buy a server? What hardware will I need, and what will limit my server? I basically want to learn networking better; I'm a software and web developer but haven't had the resources to acquire any serious toys until now. At the time of writing, most of my websites get about 60 visits/day, so I don't expect them to be very demanding. Is there something I haven't thought of that I should have? What OS would you suggest I run - FreeBSD vs. Windows Server vs. something else?

  • Network speed between a VM and another machine which is not residing on the same host is 11 MB/s at most

    - by Henno
    Problem: network speed between a VM and another machine that does not reside on the same host is 11 MB/s at most.

    Topology: [network diagram in the original post]

    Facts:

    - ESXi 5 version is 5.0.0.504890
    - the VM has the latest VMware Tools installed
    - the VM uses the E1000 network driver
    - the physical box runs Windows Server 2008 R2
    - CrystalDiskMark says the drive on the physical box can read/write 100 MB/s
    - vCenter is another VM on the ESX host
    - both the VM and the physical box report a 1 Gbps link speed
    - Configuration > Networking shows vmnic0 as 1000 Full
    - NTttcp is a client/server tool from Microsoft for measuring pure network throughput

    Here's what I've done so far:

    Test 1: the VM runs FileZilla FTP Server (default settings, one user account created) and the physical box runs FileZilla FTP Client (default settings). Uploading a big file from the physical box to the FTP server runs at ~11 MB/s as observed by Windows Task Manager on both machines (bad); downloading the same file is still ~11 MB/s (bad). Could it be a disk performance issue?

    Test 2: the physical box runs ntttcpr.exe -a 6 -m 6,0,VM_IP_ADDRESS and the VM runs ntttcps.exe -a 6 -m 6,0,PHY_BOX_IP_ADDRESS. Transfer speed: ~11 MB/s (bad). Could it be a switch performance issue?

    Test 3: the physical box runs the vSphere Client; I open Summary > Storage > datastore > Browse Datastore... and upload a file to the datastore. Transfer speed as observed on the physical box: ~26-36 MB/s (good). Could it be a VM-specific issue?

    Test 4: installed NTttcp on another VM on the same ESX server and measured network performance between VMs on the same host: ~90-120 MB/s (excellent :)

    Test 5: I have another ESX server on the same site, connected to the same datastore and the same switch. The two ESX servers each have two NICs: one goes to the switch, the other goes directly to the other ESX server. I vMotioned one of the test VMs to the other ESX host and measured network performance between VMs on different ESX servers: ~11 MB/s (bad).

    I'm aware of these threads - ESXi 4.1 slow file transfer, ESXi 5 network performance is slow, Debian Etch and ESXi slow network speeds, VMWare ESXi slow file copy to guest - but they did not help (or I must have missed something).
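
    One experiment the tests above suggest (intra-host traffic is fast, but anything crossing the physical NIC is capped): swap the guest's E1000 adapter for VMXNET3, which offloads more work to the host. A hedged PowerCLI sketch with hypothetical server and VM names - changing the vNIC type requires the VMware Tools driver in the guest:

        # Replace the E1000 vNIC with VMXNET3 (power the VM off first).
        Connect-VIServer vcenter.example.com
        Get-VM "TestVM" | Get-NetworkAdapter | Set-NetworkAdapter -Type Vmxnet3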

  • 2 servers on 2 networks in same office

    - by irot
    Hello gents, my office doesn't have a "server guy" in its employ, so I'm stuck with fixing server issues for now.

    There are two servers in our office, both file/web servers accessible only via the LAN. They are currently on the same network, so no issue there. The problem is that we recently got a static IP to use, but it's with a different ISP, so now we have two routers in the office. I would like to open one of the servers to the public as a web/FTP server, but if I hook a server up to the new router, users will no longer be able to access the files shared on that server (because they're on different networks).

    How can I make one server accessible to the public via the static IP line while still sharing its files with the users connected to the other network? The server I want to make public runs Windows Server 2008, the other Windows Server 2003. As far as I know, IP addresses are assigned by the routers. I'm just a developer and don't know much about networking. Thank you in advance.
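
    One common approach, sketched below with hypothetical addresses: dual-home the Windows Server 2008 box with two NICs, one on each network, and give only the new static-IP side a default gateway, so public traffic goes out the right router while LAN users still reach the shares directly:

        rem LAN-facing NIC: address only, no default gateway.
        netsh interface ip set address "LAN" static 192.168.1.10 255.255.255.0
        rem Public-facing NIC: address plus the new router as default gateway.
        netsh interface ip set address "WAN" static 203.0.113.10 255.255.255.0 203.0.113.1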

  • KVM and libvirt: How to configure a new disc device to an existing VM?

    - by initall
    I've got an Ubuntu 9.04 server running two VMs. In /etc/libvirt/qemu/machine1.xml, two disk devices are defined like this:

        <devices>
          <emulator>/usr/bin/kvm</emulator>
          <disk type='file' device='disk'>
            <source file='/vserver/machine1/disk0.qcow2'/>
            <target dev='hda' bus='ide'/>
          </disk>
          <disk type='file' device='disk'>
            <source file='/vserver/machine1/disk1.qcow2'/>
            <target dev='hdb' bus='ide'/>
          </disk>

    I need more storage space in at least one of the devices and thought about adding a third hdc device by simply adding one in the same style as above and reorganising my mount structure (the virtual sizes of the current qcow2 files are unfortunately limited). My problem is that reloading libvirtd and restarting the VM does not result in a new visible device (checked with fdisk). I'm aware I could extend an existing qcow2 file (converting to raw format, cat-ing/adding the new one, using something like GParted) - but only as a last resort. Hopefully it's something very simple I'm missing?
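
    Worth noting: libvirt generally only re-reads that XML after a virsh define, so redefining the domain (virsh define /etc/libvirt/qemu/machine1.xml) and then doing a full shutdown/start may be the missing step. Alternatively, a hedged sketch of letting libvirt do the attach itself - the size is hypothetical, and --persistent (which writes the device into the domain XML) may not exist in the older libvirt shipped with Ubuntu 9.04:

        # Create the new image and attach it as hdc on the IDE bus.
        qemu-img create -f qcow2 /vserver/machine1/disk2.qcow2 50G
        virsh attach-disk machine1 /vserver/machine1/disk2.qcow2 hdc \
            --driver qemu --subdriver qcow2 --persistent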

  • Including email, IMs, configs, etc. in documentation or notes

    - by Jason Antman
    The shop I work in is pretty laid-back. We're on a documentation kick, only because historically we've been very bad at it. We do a lot of our brainstorming in face-to-face meetings, and also communicate heavily via IM in addition to email. While I'm usually pretty good about documentation and keeping copious lab notes, I just finished a build of a host and spent hours searching through IMs, emails, and files on my workstation to pull out anything I'd missed in my lab notes, which formed a large part of the basis for the internal documentation.

    Does anyone have any thoughts on managing various data sources (especially email and IM) and tracking them on a per-project basis, aside from manually saving things to a project directory? Ideally, I'd like an easy way to put copies of emails, IM logs, etc. into a project-specific directory on my workstation and then just have a cron job that syncs it up with a shared folder. This isn't really a candidate for anything more advanced, as the bulk of the data will be copies of configs, code, etc.

    The big restrictions: email is via a centralized Zimbra install, so nothing can happen server-side, and my workstation is Linux. Aside from writing Pidgin and Thunderbird plugins that let me tag chats and emails as belonging to a project and then copy them to the appropriate place... any thoughts? Suggestions? Thanks, Jason
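
    The cron-plus-rsync half of that is simple enough; a hedged sketch with hypothetical paths, mirroring the project directory to the shared folder nightly so anything dropped into it lands there too:

        # crontab entry (crontab -e): one-way sync at 02:00 every night.
        0 2 * * * rsync -a --delete /home/jason/projects/ /mnt/shared/projects/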

  • Calendar booking issue - Exchange 2003 and 2010

    - by NaOH
    In our organization we are running Exchange 2003 and 2010 simultaneously, with the hope of migrating everyone to Exchange 2010 sometime within the next few months. Everyone is using Outlook 2010.

    Recently we had an issue with transaction log storage on the Exchange 2003 server. This was resolved, but for some reason no meeting room on the Exchange 2003 server will automatically book meetings any longer. I have played around with this for a while - changing calendar permissions, turning resource scheduling off and back on, etc. No dice.

    My next step was to migrate a resource to the Exchange 2010 server. After doing so, setting it up as a Room, enabling Auto-Accept and removing the EnableDirectBooking registry entry on my PC, I can book a meeting with this room. If EnableDirectBooking is enabled, I get an error message stating:

        "Meeting Room" declined your meeting because it is recurring. You must book
        each meeting separately with this resource.

    This is despite the fact that the meeting I'm attempting to create has no recurrence. Now, I have also created a new test Room from scratch on the Exchange 2010 server, and I can book a meeting with that Room regardless of whether I have the EnableDirectBooking registry entry in place. All users here have this entry, and I'd rather not have to figure out how to push something out to remove it from every PC. Instead, I'd like to figure out what's different between the configurations of these two meeting rooms, so that a room can be booked regardless of whether EnableDirectBooking is enabled. Any ideas, anyone? Thanks!
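
    Since both rooms now live on Exchange 2010, one way to hunt for the difference is to diff their booking policies from the Exchange Management Shell - a hedged sketch with a hypothetical room name:

        # Dump the migrated room's policy, compare with the working test room,
        # then align whatever differs (recurring meetings being the obvious suspect).
        Get-CalendarProcessing -Identity "Meeting Room" | Format-List
        Set-CalendarProcessing -Identity "Meeting Room" -AutomateProcessing AutoAccept -AllowRecurringMeetings $true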

  • About to go live: virtual dedicated server or cloud?

    - by morpheous
    I am about to launch my startup, and we will be going live in a few weeks. We have really tight budget constraints, since we are bootstrapping, and would prefer not to raise external capital.

    I can't use shared hosting because I need more control of the server (for technical reasons - e.g. proprietary extensions to PHP, Apache, and the database layer), but I want to control costs and don't want to go the fully dedicated route until we have determined the market size. So the only real alternatives, AFAIK, are a virtual server or the cloud. At the moment, cloud services seem a bit vague to me. My understanding is that they let an entity outsource its IT infrastructure, which in my mind is indistinguishable from what a hosting provider offers (at least functionally) - I would like some clarification on exactly what the difference between the two is.

    Back to my original question, my requirements are:

    - IT infrastructure that can scale with growth
    - control of the machine (e.g. to install our internally developed libraries)
    - backup software that is flexible and comprehensive enough (yet simple to use) to implement a (secured) backup strategy

    On the backup issue, I have always wondered where the backed-up data is actually stored, since the physical machines are remote and one can't get access to any actual tapes. I would like some advice and recommendations in this area too. Regarding data size, I expect the dataset to grow by a few megabytes a day (say 10 MB initially, possibly 50 MB in about a year's time). As an aside, I have decided to deploy on a Debian server (most of my additional libraries were compiled and built on a Debian machine).

    Mindful of all the above, I would like some advice (with reasons) on which route to take, and also which backup software to use, from people who have walked a similar path.

  • How to create domain or router-level workgroup (dd-wrt micro)

    - by Anthony
    In Windows, is Active Directory required to use a "Domain" instead of a "Workgroup"? Do I need to register a domain with a DNS provider like GoDaddy?

    What I really want to do is set up my home LAN so that everyone connecting to the main router - which is everyone, about 30 people - can see each other. I've tried having everyone use the same workgroup name; still hit or miss. I tried setting the domain name and host name on the router itself; still nothing. I've tried joining the domain name I set instead of the workgroup, and I get an AD error. Ideally, everyone connected to the main router should simply see each other and any shared folders.

    I've had this problem on other large LANs where I was not the network admin, and I've never been able to figure out why people sometimes disappear or never see each other. I'd really prefer using the native sharing functionality in the OS to setting up an internal FTP or Samba server. Any sure-fire ways to fix this? (Maybe an open-source clone of AD?) Thanks!

  • Need help getting the Perl module DBD::mysql installed for Bugzilla on Red Hat

    - by Alos Diallo
    Hi everyone. I am having some issues getting Bugzilla set up: I have the software on the server and am trying to install the prerequisites. I am using Red Hat (gcc 4.1.2-42), and I have all of the required Perl modules except one, DBD::mysql.

    When I try sudo perl install-module.pl DBD::mysql, I get the following response (excerpt):

        rm -f blib/arch/auto/DBD/mysql/mysql.so
        LD_RUN_PATH="/usr/lib64/mysql:/usr/lib64:/lib64" /usr/bin/perl myld gcc
          -shared -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions
          -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic
          dbdimp.o mysql.o -o blib/arch/auto/DBD/mysql/mysql.so \
          -L/usr/lib64/mysql -lmysqlclient -lz -lcrypt -lnsl -lm -L/usr/lib64 -lssl -lcrypto
        /usr/bin/ld: skipping incompatible /usr/lib/libssl.so when searching for -lssl
        /usr/bin/ld: skipping incompatible /usr/lib/libssl.a when searching for -lssl
        /usr/bin/ld: cannot find -lssl
        collect2: ld returned 1 exit status
        make: *** [blib/arch/auto/DBD/mysql/mysql.so] Error 1
        /usr/bin/make -- NOT OK
        Running make test
        Can't test without successful make
        Running make install
        make had returned bad status, install seems impossible

    I then tried CFLAGS="-I/usr/lib64/mysql:/usr/lib64:/lib64" perl install-module.pl DBD::mysql and got the same result; installing via CPAN fails the same way too. Right now I have DBD-mysql v3.0007 but need v4.00. Also, when I try to install OpenSSL it says I have the latest version. Does anyone know what I have to do to get this to work? Any help would be greatly appreciated. Thank you.
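
    Reading the linker errors: ld can only find 32-bit copies of libssl while building a 64-bit module, which usually means the OpenSSL development package is missing (the runtime package is what reports "latest version"). A hedged sketch of the likely fix:

        # Install the -devel packages that provide the 64-bit link symlinks,
        # then retry the module build.
        sudo yum install openssl-devel mysql-devel
        sudo perl install-module.pl DBD::mysql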

  • Outlook security alert after adding a second wireless access point to the network

    - by Mark
    We just added a Netgear WG103 wireless access point in our conference room to let visitors reach the internet through our internal network. When it's switched on, visitors can connect and everything works fine - except that normal users of the network then get a Security Alert when they try to start Outlook 2007.

    The Security Alert is the same as the one shown in question 148526 asked by desiny back in June 2010 (http://serverfault.com/questions/148526/outlook-security-alert-following-exchange-2007-upgrade-to-sp2), except that rather than "autodiscover.ad.unc.edu" our alert references our "Remote.server.org.uk". If I view the certificate, it relates to "Netgear HTTPS:...", but the only Netgear equipment we have is the new access point in the conference room. If the access point is switched off, the Security Alert does not appear.

    At first I thought it was because we had selected "WPA-PSK & WPA2-PSK" network authentication, but it continues to occur even with "Shared Key" WEP data encryption. I don't understand why adding a Netgear wireless access point would cause Outlook to issue a Security Alert when users try to read their email. Does anyone know what I have to do to get rid of it? Thanks in advance for reading this and helping me out.

  • Password Authentication Fails - NTLMv2

    - by JMeterX
    Environment:

    - Windows 2000 SP4 domain controller, with no trust set up with the Win2008 server
    - Windows XP machines
    - Windows 2008 server
    - NetApp NAS

    Problem: we have a shared folder that resides on the NAS, using a Windows 2008 AD for authentication, with the proper permissions set up. When the Windows 2000 machine tries to open the share, it is prompted for a username and password; upon entering the credentials, it continuously re-asks for them.

    Important details:

    - the Windows 2000 machine can ping both the XP machines and the Windows 2008 server
    - the Windows 2008 machine is mandated to use only NTLMv2
    - the Windows 2000 machine was originally set to NTLM but was recently switched to "NTLMv2 if negotiated" in an attempt to connect to the share
    - as I am sure it will come up: we are using Windows 2000 because of contractual obligations

    Questions: why is password authentication failing here? And after setting a GPO for the Win2000 machine to use NTLMv2, do we need to reboot for the change to take effect? We used SECEDIT to update the GPOs without rebooting.

    Update: we checked both of the 2008 domain controllers for an error code and found Microsoft_Auth_Package_V1_0, status 0xc000006a, Event ID 4776. I know this to be an authentication error ("The value provided as the current password is not correct"). We know the password is correct, but since these two domains (Win2000 and Win2008) have no trust set up, which account should be used for authentication - one that resides on the Win2000-hosted domain?
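
    On the reboot question: LSA settings such as LmCompatibilityLevel are read at boot, so a reboot is the safe assumption even after a SECEDIT policy refresh. A hedged sketch of setting the level directly on the Windows 2000 client (level 3 = send NTLMv2 only; reg.exe comes from the Support Tools on 2000):

        rem Force the client to send NTLMv2 responses, then reboot.
        reg add "HKLM\SYSTEM\CurrentControlSet\Control\Lsa" /v LmCompatibilityLevel /t REG_DWORD /d 3 /f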

  • Understanding how IE's SmartScreen works

    - by Kevin Donn
    Today I downloaded an update to our mail server on my dev machine, using IE9 on Win7 Pro, and directed IE to save the file on our server's shared drive so I could install it later. When the download finished, IE showed a red banner at the bottom saying the ".exe is not commonly downloaded and could harm your computer", with three buttons: "Delete", "Actions", and "View downloads".

    I selected "Actions" just because I had never seen this before. It showed a "SmartScreen Filter" dialog offering three choices: "Don't run this program (recommended)", "Delete program", and "Run anyway". I just canceled the dialog, because I didn't want to run the file then anyway - I only wanted to download it so I could run it later on the server.

    When I did try to run it, it blew up immediately with "Setup was unable to create the directory - Error 5: Access is denied." I tried unblocking the file, "Run as Administrator" (even though I already was Administrator), turning off UAC, etc. Cutting to the chase: I downloaded the file again, ran WinMerge on the two copies, and it reported them identical - except the new one ran fine. I went back to my dev machine, downloaded the file through Firefox, and ran it on the server - again fine. But when I tried again through IE, SmartScreen showed its red banner and somehow clobbered the file once more, even though it was stored on another machine and WinMerge can't tell the difference between it and a good copy.

    I've looked around on the web for how SmartScreen works, but everything gives a user-level description. What I want to know is: what does it do to a file to make it unrunnable on another machine? Thanks
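
    A likely explanation, hedged: what IE changes is not the file's data but an NTFS alternate data stream named Zone.Identifier attached to it - invisible to content-only tools like WinMerge, but preserved on an NTFS share. A sketch for inspecting and stripping it (streams.exe is from Sysinternals; the filename is hypothetical):

        rem Show the zone marker IE attached to the download.
        more < setup.exe:Zone.Identifier
        rem Delete all alternate data streams from the file.
        streams.exe -d setup.exe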

  • How to find out what is causing a slow down of the application on this server?

    - by Jan P.
    This is not the typical Server Fault question, but I'm out of ideas and don't know where else to go. If there are better places to ask this, just point me there in the comments. Thanks.

    Situation: we have a web application that uses Zend Framework, so it runs in PHP on an Apache web server. We use MySQL for data storage and memcached for object caching. The application has a very unusual usage and load pattern: it is a mobile web application where, every full hour, a cron job looks through the database for users that have information waiting or an action to do, and sends that information to an (external) notification server, which pushes notifications to them. After users get these notifications they visit the app and use it, mostly for a very short time. An hour later, the same thing happens.

    Problem: in the last few weeks usage of the application has really started to grow. In the last few days we have seen very high load and a doubling of application response times during and after the sending of these notifications - so basically every hour. The server doesn't crash or stop responding to requests; it just gets slower and slower, often taking 20 minutes to recover, until the same thing starts again on the next full hour. We have extensive monitoring in place (New Relic, collectd) but I can't figure out what's wrong; I can't find the bottleneck. That's where you come in: can you help me figure out what's wrong, and maybe how to fix it?

    Additional information: the server is a 16-core Intel Xeon (8 cores with hyperthreading, I think) with 12 GB RAM, running Ubuntu 10.04 (Linux 3.2.4-20120307 x86_64). Apache is 2.2.x and PHP is 5.3.2-1ubuntu4.11. If any configuration information would help analyze the problem, just comment and I will add it.

    Graphs: phpinfo(), APC status, memcache status, collectd (processes, CPU, Apache, load, MySQL, vmem, disk), New Relic (application performance, server overview, processes, network, disks). (Sorry the graphs are GIFs and not all from the same time period, but I think the most important info is in there.)
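
    Given the clean hourly signature, one cheap next step is to make MySQL name its own suspects: enable the slow query log through a spike and summarize it afterwards. A hedged sketch - the path and threshold are assumptions for MySQL 5.1 on Ubuntu 10.04:

        # In /etc/mysql/my.cnf, [mysqld] section, then restart MySQL:
        #   slow_query_log  = 1
        #   long_query_time = 1
        # After the next notification run, summarize the worst offenders:
        mysqldumpslow /var/log/mysql/mysql-slow.log | head -n 20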

  • Small store infrastructure - where to begin?

    - by KevinM1
    It looks like my older brother is about to change jobs - from lawyer to shooting range proprietor - and since I'm the family "computer guy" I have the task of coming up with and setting up the in-store equipment. The only problem is that I don't know where to start: I'm a web programmer, not an IT specialist. To that end, I figured I should ask the pros.

    Users: 3 (myself, my brother, and his business partner).

    Equipment: one Windows (likely 7) desktop for POS software, one Windows desktop/laptop for backroom use (bookkeeping, etc.), and whatever else we turn out to need.

    I'm looking for a reliable and, well, idiot-proof way to handle backups. Neither my brother nor his business partner is tech-savvy (a web browser, email, MS Word and Excel are about the extent of their knowledge), so I need something they can handle. On-site would be preferable to off-site, given my brother's hesitance to have sensitive business data handled by an outside party.

    I'm also looking at a small on-site server; I estimate that at most 2-3 users will need access. A Linux solution would keep costs down, but I'm concerned about Windows-Linux interoperability. Also, would the store security cameras' storage be handled by the security company, or would we have to stream that data to our own server? I know from personal experience that for home security the company gives/loans a recording device to the homeowner, but I'm not sure how it works for businesses.

    I know this sounds like a shopping list, and it's pretty vague. I wish I could give more detail, but between my own ignorance and things not being 100% nailed down on the business end, I'm a bit stuck. At the very least I'd like a nudge - links to a place to start, what to look for, things I need to think about, etc. Thanks.

  • Bacula virtual backup job doesn't run, no output?

    - by Zoredache
    I am trying to get virtual backups working, but when I run a virtual backup job it appears to get created and then never actually runs. I have a full and a couple of incremental backups:

        status director
        JobId  Level  Files   Bytes    Status  Finished         Name
        ====================================================================
         1283  Full   10,565  1.963 G  OK      21-Dec-12 09:47  nms-Job
         1284  Incr      314  129.6 M  OK      21-Dec-12 09:49  nms-Job
         1285  Incr      230  147.2 M  OK      21-Dec-12 09:51  nms-Job
         1288  Incr      525  138.8 M  OK      21-Dec-12 11:25  nms-Job

    I attempt to start a job from bconsole like this:

        *run job=nms-Job level=VirtualFull
        Using Catalog "MySQL"
        Run Backup job
        JobName:  nms-Job
        Level:    VirtualFull
        Client:   nms-FileDaemon
        FileSet:  nms-FileSet
        Pool:     nms-pool (From Job resource)
        Storage:  File_d1 (From Pool resource)
        When:     2012-12-21 13:07:54
        Priority: 10
        OK to run? (yes/mod/no): Job queued. JobId=1291

    Then my new job just sits there, doing nothing. The JobStatus shows that the job was created, but it appears to never run. All the full and incremental backups terminate normally.

        *llist jobid=1291
        JobId: 1,291
        Job: nms-Job.2012-12-21_13.07.56_07
        Name: nms-Job
        PurgedFiles: 0
        Type: B
        Level: F
        ClientId: 4
        Name: nms-FileDaemon
        JobStatus: C
        SchedTime: 2012-12-21 13:07:54
        StartTime: 2012-12-21 13:07:56
        EndTime: 0000-00-00 00:00:00
        RealEndTime: 0000-00-00 00:00:00
        JobTDate: 1,356,124,076
        VolSessionId: 0
        VolSessionTime: 0
        JobFiles: 0
        JobErrors: 0
        JobMissingFiles: 0
        PoolId: 19
        PoolName: nms-pool
        PriorJobId: 0
        FileSetId: 11
        FileSet: nms-FileSet

    I am getting very frustrated that this isn't working, mostly because it gives me no error logs or output at all: I submit the job, and as far as I can tell nothing happens. Is there some status or debugging level I can set to get useful information about why this isn't working? What can I do to make this work? I was originally running Bacula 5.0.2 on Debian Squeeze; out of frustration I upgraded to 5.2.6 from the backports repository, hoping a new version might give better results.
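
    JobStatus C means "created, not yet running", which usually points at the job waiting on something rather than failing outright - for VirtualFull specifically, a missing Next Pool directive in the Pool resource is a common culprit worth checking (an assumption here, not a diagnosis). For actual output, bconsole can raise the daemons' debug levels; a hedged sketch:

        # In bconsole: crank up director debug output, rerun, then read messages.
        setdebug level=100 dir
        run job=nms-Job level=VirtualFull
        messages
        status dir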

  • Windows 7 SSH file server

    - by Siriss
    Hello all. I have looked at the other posts but have not quite found an answer; I have a question about Windows file sharing over SSH.

    I have copSSH installed and working for Remote Desktop connections, with port 22 forwarded on my router and so on. I connect from a Mac or PuTTY with this command:

        ssh -l copsshusername 3391:localhost:3389 [external ip]

    That works fine. I would now like to configure Windows 7 to give the SSH account I log in with access to certain shared folders: I have documents and videos that I would like to be able to download externally. I have done this before on Linux, and a long time ago on XP, but I cannot figure out what I am missing on Windows 7. There is a designated SSH user that copSSH uses to run the service and that I log in as. I have googled and googled and have not found a solution that does everything I need, which is why I am turning here for ideas. I hope I am explaining this correctly. Thank you very much for your help!
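
    Assuming the copSSH account maps to a normal Windows account, the missing piece may simply be NTFS permissions on the folders plus SFTP over the existing port-22 connection - a hedged sketch with hypothetical paths and account name:

        rem Grant the SSH account read access to the folder tree.
        icacls "D:\Videos" /grant copsshuser:(OI)(CI)RX
        rem Then, from the Mac, fetch files over the same SSH service:
        sftp copsshuser@your.external.ip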

  • How do I rename my old Program Files folder?

    - by SteveJ
    I installed a new SSD as my boot drive (C:), installed a fresh version of Windows 7 64-bit, and kept my existing SATA drive in the system (D:). I want to keep using the D: drive for file storage (no sense filling up the SSD with stuff that isn't performance-critical), and I haven't formatted it because there's stuff on there I want to keep. I also want to create a new "D:\Program Files" folder so I can install non-critical apps there.

    So I decided to rename the existing "D:\Program Files" from my old Windows install to "D:\Old Program Files" and then create a new "D:\Program Files" directory. Easy, right? I can see "D:\Program Files" just fine in Explorer. I right-click, select Rename, and type "Old Program Files". I get the alert saying I need Admin permission and press the confirm button with the shield - but the folder still appears as "Program Files" in Explorer. Out at the command line, dir shows it as "Old Program Files". I can even do mkdir "Program Files", and then dir shows both. But in the Explorer GUI it looks like I have two "Program Files" folders, which will be confusing during app installation because I won't be able to tell which is which.

    I've poked around the properties tab of the old folder but can't find anything that would explain what's causing this. How do I rename the old Program Files folder?
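
    A likely explanation, hedged: Explorer displays the localized display name cached from the folder's desktop.ini rather than the real directory name, so the rename "takes" on disk but not on screen. A sketch of clearing it, on the assumption that desktop.ini is the culprit:

        rem Unhide and remove the stale desktop.ini, then restart the shell.
        attrib -s -h "D:\Old Program Files\desktop.ini"
        del "D:\Old Program Files\desktop.ini"
        taskkill /f /im explorer.exe & start explorer.exe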

  • EMC VNX iSCSI setup - unsure about SP/port assignment

    - by pauska
    We have a new VNX5300 waiting to be configured, and I need to plan the network infrastructure before the EMC tech arrives. It has 4 x 1 Gbit iSCSI ports per SP (8 ports in total), and I'd like to get the most performance out of them until we jump to 10 Gbit iSCSI.

    From what I can read in the docs, the recommendation is to use only two ports per SP, one active and one passive. Why is this? It seems pointless to have quad-port I/O modules and then recommend using no more than two of them.

    I'm also a bit unsure about the zoning. The best practices guide states that each port on each SP should be separated from the others on different logical networks. Does this mean I have to create four logical networks to use all 8 ports? The guide also gives an example (the accompanying diagram is omitted here): does it mean A0 and B0 should sit on the same physical switch as well? Won't that push all traffic through one switch (if both A1 and B1 are passive)?

    Edit: another brain-puzzle I don't get - each host (as in server) should not have more iSCSI bandwidth available than the storage processor. Why on earth does this matter? If server A has 1 Gbit and server B has 100 Mbit, the resulting bandwidth between them is 100 Mbit. How can this result in any kind of oversubscription?

    Edit 4: wait, what - active and passive ports? The VNX runs in an ALUA configuration with asymmetrical active/active; there shouldn't be any passive ports, only preferred ones.

  • Enabled Network Discovery on Server, and now VNC and Squeezebox clients don't work

    - by Mike Hanson
    I've recently set up a Windows Server 2008 box. It's running an email server, Squeezebox server, MS SQL Server, etc., and I do remote maintenance with UltraVNC. I had everything working fine. Then the server needed to access a network share on another machine, and I was prompted to turn on network discovery, which I did, choosing the Home rather than Public option. Since then, some things have stopped working while others are still fine: shared folders and the email services (ports 25 and 110) are still accessible, but VNC (port 5900) and the Squeezeboxes (port 9000) no longer work.

    Here's what I've tried so far:

    - checked the network discovery settings to see if anything looked strange
    - checked the firewall settings; those ports appear to be open
    - also in the firewall settings, the entries for Private-profile Network Discovery were all on but the Domain/Public ones were off, so I tried turning those on
    - in the services, turned on Function Discovery Resource Publication and SSDP Discovery

    Any other suggestions? (See the sketch below for one more thing worth trying.)
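
    Since enabling discovery re-categorized the network, the firewall may now be matching the VNC and Squeezebox rules against the wrong profile. A hedged sketch that re-opens both ports for every profile (rule names are arbitrary):

        netsh advfirewall firewall add rule name="UltraVNC" dir=in action=allow protocol=TCP localport=5900 profile=any
        netsh advfirewall firewall add rule name="Squeezebox" dir=in action=allow protocol=TCP localport=9000 profile=any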

  • NFS confusion - writing many small files

    - by Antonis Christofides
    I have a Debian Squeeze amd64 machine that is both an NFSv4 server and client (it mounts itself through NFSv4). The local directory that leads directly to disk is /nfs4exports/mydir, whereas /nfs4mounts/mydir is the same thing mounted through NFS, using the machine's external IP address. Here is the line from fstab:

        176.9.116.102:/mydir /nfs4mounts/mydir nfs4 soft 0 0

    I have an application that writes many small files. If I write directly to /nfs4exports/mydir, it writes thousands of files per second; but if I write to /nfs4mounts/mydir, it writes about 4 files per second. I can greatly increase the speed by adding async to /etc/exports. (Writing a single large file to the NFS directory goes at more than 100 MB/s.)

    I am confused by the description of async in NFS. If my application accesses the local directory, system calls like write and close return even if caches have not been flushed to permanent storage. Apparently this is not true of NFS sync behaviour. However, with NFS async behaviour, even calls like fsync are ignored. Isn't it possible to behave like local files, i.e. work asynchronously in general but honour fsync and O_SYNC?
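
    For reference, the export-side switch under discussion is a single word in /etc/exports - a hedged sketch, with the options beyond async being assumptions. With async, the server acknowledges writes (and COMMITs, which is what fsync becomes on the wire) before data reaches disk, which is exactly why small-file workloads speed up and why fsync loses its guarantee:

        /nfs4exports/mydir 176.9.116.102(rw,async,no_subtree_check)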

  • Own server, multiple website: most secure PHP setup

    - by plua
    Hi there. We have a company server hosting a variety of websites, maintained by different people within our company. All the websites are public, but access to the server itself is limited to our company. This is NOT a shared hosting environment.

    We are looking into securing the server, and are currently analyzing the risks around file permissions. We feel the highest risk is uploaded files being opened or executed by the public. This should not happen, but an error in a script might allow it (there are image uploaders, file uploaders, etc.), and the uploader scripts use PHP.

    So the question is: what is the best way of setting up and organizing the permissions of files and processes? There seem to be several ways to run PHP (and Apache) and to set the permissions. What should we take into consideration? Any tips? We are considering mod_php and FastCGI, but perhaps given our situation other solutions are preferred?
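
    For the FastCGI route, one widely used layout is a PHP-FPM pool per site, each running as its own user, so a compromised uploader script can only write where that site's user can. A hedged sketch of a pool definition - the names and paths are assumptions:

        ; e.g. /etc/php5/fpm/pool.d/site1.conf
        [site1]
        user = site1
        group = site1
        listen = /var/run/php-fpm-site1.sock
        php_admin_value[open_basedir] = /var/www/site1
        php_admin_flag[allow_url_include] = off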

  • Cloning a NAS drive which hosts a SQL Server DB

    - by Adrian Hand
    We have a system in the field running a server application that is suffering major performance issues. The system in question has two onboard 300 GB SAS drives in RAID 5, from which it boots Windows Server 2003, and a 6 TB Buffalo TeraStation NAS unit (also RAID 5) to which the server app does all of its reading and writing. I believe the TeraStation is the source of all our woes.

    While under load, reads and writes tick by at something on the order of 1 MB/s, though the network in question is hardly utilised. The TeraStation contains various data, but crucially it hosts a full instance's worth of SQL Server .mdf and .ldf files (master etc. - the whole shooting match).

    I wish to stop all the services on the server, then take everything on the TeraStation and essentially clone it to some alternative onboard storage, so as to eliminate the TeraStation from the equation as far as poor performance is concerned. In other words, the TeraStation is currently drive D:; I want to copy everything off and have the duplicate assume the drive letter, so that as far as the software is aware nothing has changed. This is tricky because of the .mdf and .ldf files - everything else will work with a straight-up file copy. Can anyone suggest a means to achieve this? Many thanks!
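
    A hedged sketch of the cutover, assuming the new volume is E: and the SQL instance is the default MSSQLSERVER: stopping the service closes the .mdf/.ldf files so they copy like any other file, and swapping drive letters afterwards means no paths change from SQL Server's point of view:

        net stop MSSQLSERVER
        robocopy D:\ E:\ /MIR /COPYALL /R:1 /W:1
        rem In diskpart (or Disk Management): remove letter D: from the
        rem TeraStation volume, assign D: to the new volume, then:
        net start MSSQLSERVER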
