Search Results

Search found 21331 results on 854 pages for 'require once'.


  • Authentication in Apache2 with mod_dav_svn

    - by Poita_
    I'm having some trouble setting up authentication in Apache2 for an SVN repository that's being served using mod_dav_svn. Here is my Apache config for the directory:

        <Location /svn>
            DAV svn
            SVNParentPath /var/svn/repos
            AuthType Basic
            AuthName "Subversion Repository"
            AuthUserFile /etc/apache2/dev.passwd
            Require valid-user
        </Location>

    I can use svn with the projects under /var/svn/repos, so I know DAV is working, but when I do svn updates or commits (or anything else), Apache never asks for any authentication. It behaves exactly the same whether the Auth directives are there or not. The permissions on the repository directory (and all subdirectories/files) grant access only to www-data (the Apache2 user/group). I have also made sure that all relevant modules are enabled (in particular mod_auth, and all the mod_dav* modules). Any ideas why svn commands aren't authenticating? Thanks in advance.
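
    A quick way to double-check the moving parts (a sketch, assuming the Debian/Ubuntu Apache 2.2 layout; module and file names may differ on other distros):

        sudo a2enmod dav dav_svn auth_basic authn_file   # confirm the DAV and Basic-auth modules are enabled
        sudo htpasswd -c /etc/apache2/dev.passwd alice   # use -c only when creating the password file for the first time
        sudo apache2ctl configtest                       # catches a <Location> block sitting in a vhost that isn't loaded
        sudo /etc/init.d/apache2 restart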


  • A simple Volume Replication Tool for large data set?

    - by Jin
    I'm looking for a solution to the following:

        Server A (Site A) - Win 2008 R2 - approx 10 TB (15 TB max) of data - well over 8 million files
        Server B (Site B) - Win 2008 R2

    I want to asynchronously replicate Server A's volume to a volume on Server B for data redundancy, so that I can tell my users "go here for the data" when/if Server A goes belly up due to machine problems, disaster, etc. Windows 2008 R2 does have DFS, but Microsoft apparently does not support a dataset this large (or more accurately, more than 8 million files, according to the docs I could find). I also looked at Veritas Volume Replication, but that seems like overkill, as I would also require Veritas Volume Manager. There is plenty of backup software that makes a 1:1 copy, which would be OK, but since it will be transferring over the internet I'd like something that compresses data during transfer, as DFS does. Does anyone have any suggestions regarding this?


  • Incremental backup services with change only charges?

    - by wowowewah
    I'm looking for online backup services that provide incremental, change-only backups. I want to transfer as little data as possible, and would like to find a service that provides full backups every week along with incremental backups every day. Are there any specialist companies that deal with this, or do I just use one of the standard backup providers? Any recommendations appreciated. To expand on this: I'm looking for software/services that work on Unix. Linux is fine as well, and FreeBSD's Linux compatibility layer should run it. Command line would be ideal, so it doesn't require X. Thanks.
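
    For context, the kind of change-only transfer I have in mind is what rsync does with hard-linked snapshots (a rough sketch; host names and paths are made up, and a real service would add retention and encryption on top):

        # daily snapshot: unchanged files are hard-linked against yesterday's copy,
        # so only changed data crosses the wire; -z compresses the transfer
        TODAY=$(date +%F); YESTERDAY=$(date -d yesterday +%F)
        rsync -az --delete \
            --link-dest=/backups/host1/$YESTERDAY \
            user@host1:/data/ /backups/host1/$TODAY/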


  • OpenVPN and PPTP on XEN VPS

    - by amiv
    I have a Debian-based system (Ubuntu 11.10) on a Xen VPS. I've installed OpenVPN and it works great. I need to install PPTP too, so I did, and clients can connect, but they have no internet on the client side. If I connect to the VPN over PPTP I can ping and access only my VPS by its IP, nothing else; there's no "internet" on the client side. It doesn't look like a DNS problem (I'm using 8.8.8.8), because I can't even ping known IPs. I bet the solution is simple, but I don't have any idea. Any guesses?

    /etc/pptpd.conf:
        option /etc/ppp/pptpd-options
        logwtmp
        localip 46.38.xx.xx
        remoteip 10.1.0.1-10

    /etc/ppp/pptpd-options:
        name pptpd
        refuse-pap
        refuse-chap
        refuse-mschap
        require-mschap-v2
        require-mppe-128
        ms-dns 8.8.8.8
        ms-dns 8.8.4.4
        proxyarp
        nodefaultroute
        lock
        nobsdcomp

    /etc/ppp/ip-up:
        [...]
        ifconfig ppp0 mtu 1400

    /etc/sysctl.conf:
        [...]
        net.ipv4.ip_forward=1

    Command which I ran (46.38.xx.xx is the IP of my VPS):
        iptables -t nat -A POSTROUTING -j SNAT --to-source 46.38.xx.xx

    Clients can connect; the first one gets IP 10.1.0.1 and the DNS servers from Google. I bet it's an iptables problem, am I right? I'm an iptables noob and I have no idea what's wrong.

    Here are route and ifconfig before a client connects via PPTP:

        root@vps3780:~# route
        Kernel IP routing table
        Destination   Gateway        Genmask          Flags  Metric  Ref  Use  Iface
        default       xx.xx.tel.ru   0.0.0.0          UG     100     0    0    eth0
        10.8.0.0      10.8.0.2       255.255.255.0    UG     0       0    0    tun0
        10.8.0.2      *              255.255.255.255  UH     0       0    0    tun0
        46.38.xx.0    *              255.255.255.0    U      0       0    0    eth0

        root@vps3780:~# ifconfig
        eth0  Link encap:Ethernet  HWaddr 00:16:3e:56:xx:xx  inet addr:46.38.xx.xx  Bcast:0.0.0.0  Mask:255.255.255.0  inet6 addr: fe80::216:xx:xx:dfb6/64 Scope:Link
              UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1  Interrupt:24
              RX packets:22671 errors:0 dropped:81 overruns:0 frame:0  TX packets:2266 errors:0 dropped:0 overruns:0 carrier:0  collisions:0 txqueuelen:1000
              RX bytes:1813358 (1.8 MB)  TX bytes:667626 (667.6 KB)
        lo    Link encap:Local Loopback  inet addr:127.0.0.1  Mask:255.0.0.0  inet6 addr: ::1/128 Scope:Host
              UP LOOPBACK RUNNING  MTU:16436  Metric:1
              RX packets:100 errors:0 dropped:0 overruns:0 frame:0  TX packets:100 errors:0 dropped:0 overruns:0 carrier:0  collisions:0 txqueuelen:0
              RX bytes:10778 (10.7 KB)  TX bytes:10778 (10.7 KB)
        tun0  Link encap:UNSPEC  HWaddr 00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00  inet addr:10.8.0.1  P-t-P:10.8.0.2  Mask:255.255.255.255
              UP POINTOPOINT RUNNING NOARP MULTICAST  MTU:1500  Metric:1
              RX packets:602 errors:0 dropped:0 overruns:0 frame:0  TX packets:612 errors:0 dropped:0 overruns:0 carrier:0  collisions:0 txqueuelen:100
              RX bytes:90850 (90.8 KB)  TX bytes:418904 (418.9 KB)

    And here they are after a client connects via PPTP:

        root@vps3780:~# route
        Kernel IP routing table
        Destination   Gateway        Genmask          Flags  Metric  Ref  Use  Iface
        default       xx.xx.tel.ru   0.0.0.0          UG     100     0    0    eth0
        10.1.0.1      *              255.255.255.255  UH     0       0    0    ppp0
        10.8.0.0      10.8.0.2       255.255.255.0    UG     0       0    0    tun0
        10.8.0.2      *              255.255.255.255  UH     0       0    0    tun0
        46.38.xx.0    *              255.255.255.0    U      0       0    0    eth0

        root@vps3780:~# ifconfig
        eth0  Link encap:Ethernet  HWaddr 00:16:3e:56:xx:xx  inet addr:46.38.xx.xx  Bcast:0.0.0.0  Mask:255.255.255.0  inet6 addr: fe80::216:xx:xx:dfb6/64 Scope:Link
              UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1  Interrupt:24
              RX packets:22989 errors:0 dropped:82 overruns:0 frame:0  TX packets:2352 errors:0 dropped:0 overruns:0 carrier:0  collisions:0 txqueuelen:1000
              RX bytes:1841310 (1.8 MB)  TX bytes:678456 (678.4 KB)
        lo    Link encap:Local Loopback  inet addr:127.0.0.1  Mask:255.0.0.0  inet6 addr: ::1/128 Scope:Host
              UP LOOPBACK RUNNING  MTU:16436  Metric:1
              RX packets:112 errors:0 dropped:0 overruns:0 frame:0  TX packets:112 errors:0 dropped:0 overruns:0 carrier:0  collisions:0 txqueuelen:0
              RX bytes:12102 (12.1 KB)  TX bytes:12102 (12.1 KB)
        ppp0  Link encap:Point-to-Point Protocol  inet addr:46.38.xx.xx  P-t-P:10.1.0.1  Mask:255.255.255.255
              UP POINTOPOINT RUNNING NOARP MULTICAST  MTU:1400  Metric:1
              RX packets:66 errors:0 dropped:0 overruns:0 frame:0  TX packets:15 errors:0 dropped:0 overruns:0 carrier:0  collisions:0 txqueuelen:3
              RX bytes:10028 (10.0 KB)  TX bytes:660 (660.0 B)
        tun0  Link encap:UNSPEC  HWaddr 00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00  inet addr:10.8.0.1  P-t-P:10.8.0.2  Mask:255.255.255.255
              UP POINTOPOINT RUNNING NOARP MULTICAST  MTU:1500  Metric:1
              RX packets:602 errors:0 dropped:0 overruns:0 frame:0  TX packets:612 errors:0 dropped:0 overruns:0 carrier:0  collisions:0 txqueuelen:100
              RX bytes:90850 (90.8 KB)  TX bytes:418904 (418.9 KB)

    And the (ugly) iptables --list output:

        root@vps3780:~# iptables --list
        Chain INPUT (policy ACCEPT)
        target  prot opt source        destination

        Chain FORWARD (policy ACCEPT)
        target  prot opt source        destination
        ACCEPT  all  --  anywhere      anywhere     state RELATED,ESTABLISHED
        ACCEPT  all  --  10.8.0.0/24   anywhere
        REJECT  all  --  anywhere      anywhere     reject-with icmp-port-unreachable
        ACCEPT  all  --  10.1.0.0/24   anywhere
        ACCEPT  all  --  anywhere      anywhere     state RELATED,ESTABLISHED
        ACCEPT  all  --  10.1.0.0/24   anywhere
        REJECT  all  --  anywhere      anywhere     reject-with icmp-port-unreachable
        ACCEPT  all  --  anywhere      anywhere     state RELATED,ESTABLISHED
        ACCEPT  all  --  10.8.0.0/24   anywhere
        REJECT  all  --  anywhere      anywhere     reject-with icmp-port-unreachable

    And the (ugly) iptables -t nat -L output:

        root@vps3780:~# iptables -t nat -L
        Chain PREROUTING (policy ACCEPT)
        target      prot opt source        destination

        Chain INPUT (policy ACCEPT)
        target      prot opt source        destination

        Chain OUTPUT (policy ACCEPT)
        target      prot opt source        destination

        Chain POSTROUTING (policy ACCEPT)
        target      prot opt source        destination
        SNAT        all  --  10.8.0.0/24   anywhere     to:46.38.xx.xx
        MASQUERADE  all  --  10.1.0.0/24   anywhere
        SNAT        all  --  10.1.0.0/24   anywhere     to:46.38.xx.xx
        SNAT        all  --  10.8.0.0/24   anywhere     to:46.38.xx.xx
        SNAT        all  --  10.1.0.0/24   anywhere     to:46.38.xx.xx
        MASQUERADE  all  --  anywhere      anywhere
        SNAT        all  --  anywhere      anywhere     to:46.38.xx.xx
        SNAT        all  --  10.8.0.0/24   anywhere     to:46.38.xx.xx
        MASQUERADE  all  --  anywhere      anywhere
        MASQUERADE  all  --  10.1.0.0/24   anywhere
        MASQUERADE  all  --  anywhere      anywhere
        MASQUERADE  all  --  10.1.0.0/24   anywhere

    As I said, OpenVPN works very well: 10.8.0.0/24 for OpenVPN (on tun0). PPTP won't work: 10.1.0.0/24 for PPTP (on ppp0). Clients can connect, but they have no internet. Any suggestions will be appreciated; this is my second whole day fighting this with no results.

    EDIT: iptables -t filter -F resolved my problem :-)
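
    For reference, a minimal ruleset that is usually enough for a PPTP pool behind SNAT looks roughly like this (a sketch, assuming eth0 is the public interface and 10.1.0.0/24 is the PPTP range; clearing the duplicated rules first is effectively what the fix in the EDIT did):

        # start from a clean filter table, then keep FORWARD minimal
        iptables -t filter -F
        iptables -A FORWARD -i ppp+ -s 10.1.0.0/24 -j ACCEPT
        iptables -A FORWARD -o ppp+ -m state --state RELATED,ESTABLISHED -j ACCEPT
        # one SNAT rule per VPN range on the public interface, instead of overlapping SNAT/MASQUERADE rules
        iptables -t nat -F POSTROUTING
        iptables -t nat -A POSTROUTING -s 10.1.0.0/24 -o eth0 -j SNAT --to-source 46.38.xx.xx
        iptables -t nat -A POSTROUTING -s 10.8.0.0/24 -o eth0 -j SNAT --to-source 46.38.xx.xx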


  • PayPal integration woes: PDT hangs on return to site

    - by Tom
    Hi, I'm implementing PayPal IPN & PDT. After some headaches and time in the sandbox, IPN is working well and PDT returns the correct $_GET data. The flow is as follows:

    1. Pass the user ID to PayPal in the form
    2. The user buys the product, triggering IPN, which updates the database for that user ID
    3. PDT returns the transaction ID when the user returns to the site
    4. The return page says "please wait" and repeatedly checks the transaction status via Ajax
    5. The user is redirected to a success/failure page

    Everything works well, EXCEPT that when I use PayPal's ready-made PHP code for the PDT return POST, the page hangs: PayPal waits for a response and the user never gets back to my site. I'm not getting a fail status, just nothing. The odd thing is that once the unknown error occurs, my test domain becomes unresponsive for a short period. The code (PHP): https://www.paypal.com/us/cgi-bin/webscr?cmd=p/xcl/rec/pdt-code-outside If I comment out the POST back, it all works fine. I've been able to pin the problem down to the point where the code enters the while{} loop. Unfortunately, I'm not experienced enough to write a replacement for the PayPal code from scratch, so I would really appreciate any ideas on what might be wrong. The POST back goes to ssl://www.sandbox.paypal.com, and I'm using button code and an authorisation token that were all created via a sandbox test account. Thanks in advance.
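
    In case it helps frame the question, here is roughly what the postback would look like with cURL instead of the sample's fsockopen/while{} loop (a sketch only; $auth_token is a placeholder for the sandbox identity token, and I'm assuming the standard PDT parameters cmd=_notify-synch, tx and at):

        <?php
        // Minimal PDT postback via cURL -- avoids hand-rolling the socket read loop that hangs
        $tx         = $_GET['tx'];
        $auth_token = 'YOUR_SANDBOX_IDENTITY_TOKEN';  // placeholder

        $ch = curl_init('https://www.sandbox.paypal.com/cgi-bin/webscr');
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
            'cmd' => '_notify-synch',
            'tx'  => $tx,
            'at'  => $auth_token,
        )));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);        // never leave the user hanging indefinitely
        $response = curl_exec($ch);
        curl_close($ch);

        // The body starts with "SUCCESS" or "FAIL", followed by key=value lines
        if ($response !== false && strpos($response, 'SUCCESS') === 0) {
            // parse the key=value lines and show the "please wait" page
        }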


  • How do I boot a virtual machine image from my network?

    - by Haabda
    I have many machines which require the same configurations. My goal is to boot them all from the network and load a virtual machine. It would be wonderful to have one image for all of our customer service machines. That way, I could load the virtual image, perform updates, and know the next time they boot up they will have all the changes. Ideally, the machines would store the image locally and only download a new image if there has been a change. With all the information out there on "desktop virtualization", "PXE booting", and "virtual machines", I feel lost. I have been reading for hours and feel like I have only just scratched the surface. I would like to do this using open source or free software. Any suggestions?
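
    To make the network-boot half more concrete, here is roughly what a proxy-DHCP/TFTP setup with dnsmasq looks like (a sketch with made-up addresses, assuming an existing DHCP server stays in charge of leases and a PXELINUX image lives under /srv/tftp):

        # /etc/dnsmasq.conf -- PXE boot support without replacing the existing DHCP server
        port=0                          # disable the DNS part; only proxy-DHCP/TFTP is wanted
        dhcp-range=192.168.1.0,proxy    # proxyDHCP mode: hand out boot info only, never leases
        dhcp-boot=pxelinux.0
        pxe-service=x86PC,"Network boot",pxelinux
        enable-tftp
        tftp-root=/srv/tftp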


  • UPS for hard drive protection

    - by dimi
    I am in a place where the electricity is not ideal (old house, no ground); it occasionally shuts off, and supposedly there are some spikes. I'm considering a UPS with the goal of increasing the safety of my personal data. My first priority is the health of my internal and external USB hard drives, which could be damaged by the power instability. I don't care much about losing unsaved work; I just want to give my system a minimum amount of time to shut down without any risk of physically damaging the hard drives. Would a cheap offline UPS suit my needs, or do I need a better one with an automatic voltage regulator (AVR)? How critical is AVR for the hard drives? The external drives have their own power supplies and will be plugged directly into the UPS.


  • How can I stop Flash from leaving full-screen mode when it loses focus due to a mouse-click on the other monitor?

    - by therefromhere
    On a multi-monitor system, if I'm viewing a full-screen video in Flash on one monitor, clicking the mouse on the other monitor causes Flash to leave full-screen mode and revert to normal size. What's the easiest way of preventing this that works on my version of Flash? My system is Flash 10 (10.0.12.36), in Firefox 3.5 on Windows Vista 64, but I think it affects all current versions. This is very annoying behaviour, but unfortunately, according to this bug report response, it seems to be a security feature rather than a bug: "We understand that many users would like fullscreen on one monitor and to be able to interact with your OS on another monitor. However, due to security requirements, we require that Flash and Browser must be the current focus of your OS."


  • Question about ubuntu untrusted source, gpg, keyserver

    - by ???
    I have mirrored the Ubuntu archive repository (I must say it's rather huge). I can then apt-get install with no problem, but it prompts with the following warning:

        WARNING: The following packages cannot be authenticated! xxxx, xxxx, ...
        Install these packages without verification [y/N]?

    You can always install anyway, but I can't install from the Ubuntu software GUI, which requires a trusted source. So:

    1. How do I force the GUI to install untrusted packages?
    2. Should I configure GPG to receive some public keys? (I've already installed ubuntu-keyring and debian-keyring, but it's still untrusted.)
    3. Should I configure GPG to automatically receive unknown keys from some specific keyservers?
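
    On the GPG side, importing the archive signing key that apt complains about usually clears the warning (a sketch; apt-get update prints the missing key ID after NO_PUBKEY, so substitute it for <KEYID>):

        sudo apt-get update 2>&1 | grep NO_PUBKEY                                # find the missing key ID
        sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys <KEYID>    # import it into apt's trusted keyring
        sudo apt-get update                                                      # the "cannot be authenticated" warning should disappear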


  • Google chrome IETab login pages

    - by Jeff Storey
    Hi, I'm using Google Chrome, and for certain sites I need to use IE. I've installed IE Tab Classic, but I've noticed that when I open pages that require an Active Directory popup login, Chrome prompts me for the username/password and then switches over to IE. IE always shows a message indicating that a connection to the page could not be made; I then have to press the "Refresh the page" link, get prompted again for the username/password (this time inside IE), and only then does the login work. Does anyone know why this happens and how I can log in just once? Thanks, Jeff


  • Multisession burn in Imgburn

    - by blntechie
    Is multisession burning available in ImgBurn? If not, any idea whether it will be implemented in the future? I almost recommended ImgBurn instead of Nero or Roxio to one of my friends. He requires multisession burning, and I found no option to enable it in Options, if it's available at all. Note: please don't question the question, e.g. "Why would you want multisession anyway?" or "Isn't a USB stick/RW disc what you need instead of a read-only CD/DVD?" Please keep the answers in context: I know I can use USB sticks instead of CD/DVD, and my friend requires multisession anyway. Maybe I can ask him to keep Nero as a backup for this purpose if ImgBurn doesn't support it.


  • PDF to PNG Processor - Paperclip

    - by Josh Crowder
    I am trying to develop a system in which a user can upload a slideshow (PDF) and have each slide exported as a PNG. After some digging around I came across a post on here that suggested using a processor. I've had a go, but I can't get the command to run; if it is running, then I don't know what is happening, because no errors are being shown. Any help would be appreciated!

        module Paperclip
          class Slides < Processor
            def initialize(file, options = {}, attachment = nil)
              super
              @file           = file
              @instance       = options[:instance]
              @current_format = File.extname(@file.path)
              @basename       = File.basename(@file.path, @current_format)
            end

            def make
              dst = Tempfile.new([@basename, @format].compact.join("."))
              dst.binmode

              # source PDF first, then options, then the destination PNG
              command = <<-end_command
                #{File.expand_path(@file.path)} -resize 640x300 #{File.expand_path(dst.path)}
              end_command

              begin
                success = Paperclip.run("convert", command.gsub(/\s+/, " "))
              rescue PaperclipCommandLineError
                raise PaperclipError, "There was an error processing the thumbnail for #{@basename}"
              end

              dst  # Paperclip expects the processor to return the generated file
            end
          end
        end

    I think my problem is with the convert command. When I run that command by hand it works, but it doesn't give the details of each slide, it just executes. What I need to happen is: once it has made all the slides, pass the data back to a new model. I know where all the slides are, but once I get to that point I'm not sure what to do.
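
    For the one-PNG-per-slide part, ImageMagick can split a multi-page PDF by itself when the output filename contains a page-number placeholder, which may be simpler than driving it page by page (a sketch run outside Paperclip; the filenames are made up):

        # -density controls rasterisation quality; %02d yields slide-00.png, slide-01.png, ...
        convert -density 150 slides.pdf -resize 640x300 slide-%02d.png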


  • Unable to connect to my computer over the LAN (HTTP, SMB) in Ubuntu 10.04

    - by Abdul Majeed
    I installed Ubuntu 10.04, Apache, PHP, MySQL and Samba. Everything works fine locally on my own IP. When I try to access my computer from the LAN (another computer), it shows "unable to connect", yet when I ping my IP from the remote computer, the ping is fine. I can access the internet and all the other systems (HTTP, SMB), but the problem is that nobody can access my computer remotely on my LAN. My IP is 192.168.85.105 and I want to access it (Apache, SMB) from 192.168.85.10. Is there some proxy or firewall setting involved? I have tried the following commands: "sudo iptables -F", "sudo iptables-restore" (logout required), and, to disable netfilter, "sudo ufw --disable". Please give me the solution.
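
    A few checks that usually narrow this kind of thing down (a sketch; paths assume the stock Ubuntu 10.04 packages):

        sudo iptables -L -n -v                          # are packets from the LAN actually being dropped?
        sudo netstat -tlnp | grep -E ':(80|139|445) '   # is Apache/Samba listening on 0.0.0.0 or only on 127.0.0.1?
        grep -ri '^Listen' /etc/apache2/ports.conf      # should be "Listen 80", not "Listen 127.0.0.1:80"
        grep -i 'interfaces' /etc/samba/smb.conf        # "interfaces"/"bind interfaces only" can restrict Samba to one NIC
        sudo ufw status                                 # confirm whether ufw is active and what it allows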


  • PHPUnit: Testing if a protected method was called

    - by Luiz Damim
    I'm trying to test whether a protected method is called from a public interface.

        <?php
        abstract class SomeClassAbstract
        {
            abstract public function foo();

            public function doStuff()
            {
                $this->_protectedMethod();
            }

            protected function _protectedMethod()
            {
                // implementation is irrelevant
            }
        }

        <?php
        class MyTest extends PHPUnit_Framework_TestCase
        {
            public function testCalled()
            {
                $mock = $this->getMockForAbstractClass('SomeClassAbstract');
                $mock->expects($this->once())
                     ->method('_protectedMethod');
                $mock->doStuff();
            }
        }

    I know it is called correctly, but PHPUnit says it's never called. The same thing happens when I test the other way around, when a method should never be called:

        <?php
        abstract class AnotherClassAbstract
        {
            abstract public function foo();

            public function doAnotherStuff()
            {
                $this->_loadCache();
            }

            protected function _loadCache()
            {
                // implementation is irrelevant
            }
        }

        <?php
        class MyTest extends PHPUnit_Framework_TestCase
        {
            public function testCalled()
            {
                $mock = $this->getMockForAbstractClass('AnotherClassAbstract');
                $mock->expects($this->once())
                     ->method('_loadCache');
                $mock->doAnotherStuff();
            }
        }

    The method is called, but PHPUnit says that it is not. What am I doing wrong?

    Edit: I wasn't actually declaring my methods that way; it was just shorthand to denote a public method (interface). Updated to full class/method declarations.

    Edit 2: I should have said that I'm testing some method implementations in an abstract class (the code has been edited to reflect this). Since I can't instantiate the class, how can I test this? I'm thinking of creating a SomeClassSimple extending SomeClassAbstract and testing that one instead. Is that the right approach?
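
    For what it's worth, getMockForAbstractClass() stubs only the abstract methods by default, so an expectation on a concrete protected method like _protectedMethod() is silently ignored; listing the methods explicitly through the mock builder is the usual workaround (a sketch against the PHPUnit 3.x mock builder API):

        <?php
        class MyTest extends PHPUnit_Framework_TestCase
        {
            public function testProtectedMethodIsCalled()
            {
                // ask PHPUnit to replace _protectedMethod too, not just the abstract foo()
                $mock = $this->getMockBuilder('SomeClassAbstract')
                             ->setMethods(array('foo', '_protectedMethod'))
                             ->getMock();

                $mock->expects($this->once())
                     ->method('_protectedMethod');

                $mock->doStuff();
            }
        }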


  • Setting up Shibboleth to secure part of a website

    - by HorusKol
    I've installed the Shibboleth module for Apache on Ubuntu 10.04, using aptitude to install libapache2-mod-shib2 as per https://groups.google.com/group/shibboleth-users/browse_thread/thread/9fca3b2af04d5ca8?pli=1, and enabled the module (I have checked in /etc/apache2/mods-enabled). I then proceeded to secure a directory on the server by placing a .htaccess file in it with the following directives:

        AuthType shibboleth
        ShibRequestSetting requireSession 1
        Require valid-user

    Now, I haven't set up an SSL host yet, and I also haven't set up the IdP, but I would still expect the server to block access to this directory. Instead, I'm getting the content without any problems. I have restarted the Apache service and there are no errors in the log files.
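
    One thing worth ruling out (a sketch, assuming the stock Debian/Ubuntu Apache layout): .htaccess files are ignored unless the enclosing directory allows overrides, so putting the same directives straight into the vhost config makes it easier to tell whether mod_shib is engaged at all:

        # in the relevant <VirtualHost> (or a file under /etc/apache2/conf.d/), instead of .htaccess
        <Location /secure>
            AuthType shibboleth
            ShibRequestSetting requireSession 1
            Require valid-user
        </Location>

        # if you stay with .htaccess, the enclosing <Directory> block needs at least:
        #     AllowOverride AuthConfig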


  • Change Directory Browsing Page in IIS 7.5

    - by Gabriel Ryan Nahmias
    NOTE: this post is tagged ASP Classic, but really that's just one of the languages in which I could write it. I really need assistance with configuring IIS (7.5). I have found many scripts and ideas to achieve this, but I require that it not be a "drop-in" replacement; it must work globally for any possible directory from one codebase. Here are several links related to this goal: http://mvolo.com/get-nice-looking-directory-listings-for-your-iis-website-with-directorylistingmodule: the best example of what I want, and the one I can't seem to follow through with. http://www.daleanderson.ca/edb/: an example of a "drop-in" replacement (at least that's what it is oriented towards). It still has viable code that could be useful as the main file that processes directory traversal.


  • How can I preview various Google services in Firefox?

    - by Travis Christian
    With iGoogle shut down, I haven't been able to replicate my homepage through other services. I'm not asking for a generic replacement for iGoogle, but for how to accomplish a specific use case, whether through a third-party dashboard, a browser extension, or some DIY solution. I need an interactive Gmail inbox, Gmail Tasks, and Google Calendar in the same Firefox tab. If hosted by a third party, they need to authenticate safely using the Google account I am logged in with. Other widgets would be useful, but I only require those Google services. I'm using Gmail itself for now, but there is hardly room for the other widgets in the sidebar. Neither Google Sites nor igHome will reliably load the services, especially Tasks, and Netvibes requires raw login information for third-party services.


  • Refreshing Facebook session from an iframe application

    - by zombat
    I've got a Facebook iframe application that is completely external. By this I mean that once a user accesses the canvas URL to load the application, all the links in the iframe app go to my servers, and the canvas page never gets refreshed unless the user navigates to somewhere else on Facebook and comes back (or does a browser refresh). On the initial load of the app where Facebook creates the iframe, I get passed all the usual parameters like fb_sig_user which allows me to create an internal app session based on the facebook user. This app session (which is not the Facebook session, it's my own app session) is all I need to allow the user to work with the app. The problem comes an hour later. If the user leaves the computer, or uses the app for more than an hour, the Facebook session expires. There are some app pages which require fetching friend information, and once the FB session has expired, these pages break, throwing out errors such as "Error: Session key invalid or no longer valid". My question is whether there is a way to refresh the user's Facebook session from within an iframe application to keep it from expiring an hour later. Do any of the API calls do this? Is there a Facebook Connect trick to ping something? Is there any definitive method to keep it alive? I haven't been able to find any examples that specifically address this.


  • MSBuild script fails but produces no errors

    - by Kate
    I have an MSBuild script that I am executing through TeamCity. One of the tasks it runs is from Xheo DeployLX CodeVeil, which obfuscates some DLLs. The task I am using is called VeilProject. I have run the CodeVeil project through the interface manually and it works correctly, so I think I can safely assume that the actual obfuscation step is OK. This task used to take around 40 minutes, and the rest of the MSBuild file executed perfectly and finished without errors. For some reason this task now takes about 1 hour 20 minutes to execute. Once the VeilProject task is finished, its output says it completed successfully, yet the MSBuild script fails at that point; I have a task directly after the VeilProject task and its output never appears. My questions are: could the MSBuild script have timed out, so that by the time the task completes it is past some timeout period and simply fails? And why would the build fail with no errors and no warnings? Using diagnostic output from MSBuild I can see the following:

        [05:39:06]: [Target "Obfuscate"] Finished.
        [05:39:06]: [Target "Obfuscate"] Saving exception map
        [05:49:21]: [Target "Obfuscate"] Ended at 11/05/2010 05:49:21, ~1 hour, 48 minutes, 6 seconds
        [05:49:22]: [Target "Obfuscate"] Done.
        [05:49:51]: MSBuild output:
        Ended at 11/05/2010 05:49:21, ~1 hour, 48 minutes, 6 seconds (TaskId:8)
        Done. (TaskId:8)
        Done executing task "VeilProject" -- FAILED. (TaskId:8)
        Done building target "Obfuscate" in project "AMK_Release.proj.teamcity.patch.tcprojx" -- FAILED.: (TargetId:12)
        Done Building Project "C:\Builds\Scripts\AMK_Release.proj.teamcity.patch.tcprojx" (All target(s)) -- FAILED.

        Project Performance Summary:
            6535484 ms  C:\Builds\Scripts\AMK_Release.proj.teamcity.patch.tcprojx   1 calls
            6535484 ms  All                                                         1 calls

        Target Performance Summary:
                156 ms  PreClean                 1 calls
                266 ms  SetBuildVersionNumber    1 calls
               2406 ms  CopyFiles                1 calls
            6532391 ms  Obfuscate                1 calls

        Task Performance Summary:
                 16 ms  MakeDir                  2 calls
                 31 ms  TeamCitySetBuildNumber   1 calls
                 31 ms  Message                  1 calls
                 62 ms  RemoveDir                2 calls
                234 ms  GetAssemblyIdentity      1 calls
               2406 ms  Copy                     1 calls
            6528047 ms  VeilProject              1 calls

        Build FAILED.
            0 Warning(s)
            0 Error(s)
        Time Elapsed 01:48:57.46

        [05:49:52]: Process exit code: 1
        [05:49:55]: Build finished


  • Reliable Backup Solution for Linux for Complete System Restoration

    - by Chris S
    What's the best backup solution for Linux that can completely restore the entire filesystem to a blank hard drive (including partitioning) after an old hard drive dies? I'm currently running a few Ubuntu machines, some with RAID-1 and others without RAID (mostly laptops). I'd like to implement a backup solution that can take incremental snapshots of the entire filesystem, so that if I were to replace all the hard drives in a machine, I could use the backup to restore a perfect copy of the previous filesystem. Unfortunately, nearly all the backup solutions I've found seem to be glorified rsync scripts, which only back up some files and have no easy way to restore once the entire filesystem is gone. Some of the more complicated solutions, like Bacula, might do what I need, but require a complicated server/client setup and are notoriously difficult to maintain. I've heard that Apple's Time Machine utility has this ability, and I've had similar success taking differential disk images with Acronis True Image on Windows, but of course neither of these works on Linux. Is there anything comparable for Ubuntu?
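
    For a sense of the coverage I'm after, this is roughly what a DIY version looks like: dump the partition layout, then rsync the whole root filesystem (a sketch only, with made-up device names and destination; a real solution would also need to handle the bootloader and RAID metadata):

        # record the partition table so a blank disk can be repartitioned later
        # (restore with: sfdisk /dev/sda < sda.parts)
        sudo sfdisk -d /dev/sda > /backup/sda.parts
        # copy everything on the root filesystem, preserving ACLs, xattrs and hard links,
        # while skipping pseudo-filesystems and the backup target itself
        sudo rsync -aAXH --delete \
            --exclude="/proc/*" --exclude="/sys/*" --exclude="/dev/*" \
            --exclude="/run/*" --exclude="/tmp/*" --exclude="/backup" \
            / /backup/rootfs/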


  • How to authenticate users in nested groups in Apache LDAP?

    - by mark
    I've got working LDAP authentication with the following setup:

        AuthName "whatever"
        AuthType Basic
        AuthBasicProvider ldap
        AuthLDAPUrl "ldap://server/OU=SBSUsers,OU=Users,OU=MyBusiness,DC=company,DC=local?sAMAccountName?sub?(objectClass=*)"
        Require ldap-group CN=MySpecificGroup,OU=Security Groups,OU=MyBusiness,DC=company,DC=local

    This works, but I have to put every user I want to authenticate into MySpecificGroup. On the LDAP server I've configured MySpecificGroup to also contain the group MyOtherGroup, with another list of users, but the users in MyOtherGroup are not authenticated; I have to add them all to MySpecificGroup manually and basically can't use nested groups. I'm using Windows SBS 2003. Is there a way to configure Apache LDAP to do this, or is there a problem with possible infinite recursion and it is therefore not allowed?
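
    Two approaches commonly suggested for nested groups, depending on the Apache version (sketches only; I haven't verified either against SBS 2003): mod_authnz_ldap in Apache 2.4 can walk sub-groups itself, and Active Directory can do the recursion server-side through the LDAP_MATCHING_RULE_IN_CHAIN matching rule in the search filter:

        # Apache 2.4+: let mod_authnz_ldap recurse into nested groups
        AuthLDAPSubGroupDepth 2
        AuthLDAPSubGroupClass group
        Require ldap-group CN=MySpecificGroup,OU=Security Groups,OU=MyBusiness,DC=company,DC=local

        # Alternative for AD: push the recursion into the LDAP filter and then accept any matching user
        AuthLDAPUrl "ldap://server/OU=MyBusiness,DC=company,DC=local?sAMAccountName?sub?(memberOf:1.2.840.113556.1.4.1941:=CN=MySpecificGroup,OU=Security Groups,OU=MyBusiness,DC=company,DC=local)"
        Require valid-user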


  • Cheapest server to run Windows 2008 R2?

    - by chopps
    Hey everyone, I want to build a really cheap server to use for testing etc., but I don't want to spend a lot of dough. Any recommendations on what kind of home PC/server would work for these requirements? Any place to get refurbs at a good price?

        Processor:  Minimum 1.4 GHz (x64 processor). Note: an Intel Itanium 2 processor is required for Windows Server 2008 for Itanium-Based Systems.
        Memory:     Minimum 512 MB RAM. Maximum: 8 GB (Foundation), 32 GB (Standard) or 2 TB (Enterprise, Datacenter, and Itanium-Based Systems).
        Disk space: Minimum 32 GB or greater; Foundation: 10 GB or greater. Note: computers with more than 16 GB of RAM will require more disk space for paging, hibernation, and dump files.
        Display:    Super VGA (800 x 600) or higher resolution monitor.
        Other:      DVD drive, keyboard and Microsoft mouse (or compatible pointing device), internet access (fees may apply).


  • How to prepare WiFi for an on-stage demo?

    - by Jeremy White
    Today at WWDC, Steve Jobs gave his keynote and ended up having a failure on-stage when connecting to WiFi. Google had a similar issue a few weeks ago in the same conference center. Please reference the following article for more information. http://news.cnet.com/8301-31021_3-20007009-260.html I am looking for information on how to best prepare a demo which uses a closed wireless network in front of a large audience. Note that the network will be closed, and will not require internet access. What steps can I take to prevent interference from existing WiFi, Bluetooth, etc? How can I best prevent curious/malicious people from trying to intrude on my WiFi network? I am open to recommendations on specific models of routers.


  • force https with apache before .htpasswd

    - by johnlai2004
    I have this in my .htaccess file:

        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteRule ^(.*)$ https://www.myweb.com/phpmyadmin$1 [R,L]
        AuthUserFile /var/www/myweb/.htpasswd
        AuthGroupFile /dev/null
        AuthName "Sovereign Databases"
        AuthType Basic
        <Limit GET>
            require valid-user
        </Limit>

    But every time I go to http://www.myweb.com/phpmyadmin, the .htpasswd prompts me for credentials BEFORE I'm redirected to https://www.myweb.com/phpmyadmin. After I type in my username and password, I get redirected to https://www.myweb.com/phpmyadmin. The problem is that I don't want anyone to submit their username and password unencrypted over http. How do I force people to log in via the https version even if they typed in the http version?
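
    The usual way around this is to make sure the Basic auth only applies once the request is already on HTTPS, for example by splitting the two concerns across the vhosts instead of a single .htaccess (a sketch; ServerName and paths are made up):

        # port 80 vhost: redirect only, no auth, so credentials are never requested over plain HTTP
        <VirtualHost *:80>
            ServerName www.myweb.com
            Redirect permanent /phpmyadmin https://www.myweb.com/phpmyadmin
        </VirtualHost>

        # port 443 vhost: the auth lives here, so the challenge only ever happens over TLS
        <VirtualHost *:443>
            ServerName www.myweb.com
            SSLEngine on
            <Directory /var/www/myweb/phpmyadmin>
                AuthType Basic
                AuthName "Sovereign Databases"
                AuthUserFile /var/www/myweb/.htpasswd
                Require valid-user
            </Directory>
        </VirtualHost>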


  • Microsoft Outlook tips and tricks for improving user experience?

    - by Roee Adler
    I'm one of those heavy Microsoft Outlook users, currently working with the 2007 version. God knows this tool is heavy and can cause problems. I wondered what the Super User crowd has to suggest to improve the experience of using it. Several suggestions of my own:

    - Always work in cached mode (Tools -> Account Settings -> Change -> Use Cached Exchange Mode)
    - Use Outlook's local archiving capabilities
    - Use Outlook's RSS reader - it's simple and allows offline access to your feeds
    - If you have e-mail subscriptions to magazines, blogs, etc., create a subdirectory to keep them in, and a rule that automatically moves them there when they arrive (one rule per subscription, based on the sender e-mail)

    You can also share suggestions that require configuration of Exchange Server, for those of us who can bring them to their IT managers. What are your suggestions? PS: "Use Gmail" is not an accepted answer; some of us don't control what email system we use...

