Search Results

Search found 16797 results on 672 pages for 'directory traversal'.

Page 479/672

  • mod_rewrite issue | Request exceeded the limit of 10 internal redirects

    - by Chris Anarko Meow
    OK, what I'm doing normally works, but since my rule "includes" itself it's giving me issues, and I can't find a solution after hours of working through different options. I have a .htaccess with:

        RewriteEngine On
        RewriteBase /
        RewriteCond %{REQUEST_URI} !^/3.15.0/(.*)
        RewriteRule ^(.*)$ /3.15.0/$1 [L]

    This is for my software versions. I have a program that sometimes requests versions newer than what is on the server, which may be a couple of versions behind, so I want whatever request comes in to be forwarded to the latest version, which in this example is 3.15.0 (/var/www/nameblabla/3.15.0). My .htaccess is at /var/www/nameblabla/.htaccess, so the first condition is meant to ignore requests that already have the right path and version, and the rule should grab every other request and forward it to 3.15.0, without losing the path to the files inside. So far I can only get it to redirect to that directory but lose the path, and other times I get "Request exceeded the limit of 10 internal redirects". I guess this is because I'm including the 3.15.0 path. Any help, or another way to do this without mod_rewrite?
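
    For reference, a minimal sketch of one commonly suggested variant, assuming Apache 2.4 (the END flag does not exist in 2.2). The dots are escaped so the version matches literally, and END stops any further per-directory rewriting, which is the usual source of the 10-internal-redirects error:

        RewriteEngine On
        RewriteBase /
        # skip anything already under the versioned directory
        RewriteCond %{REQUEST_URI} !^/3\.15\.0/
        # forward everything else into the latest version, keeping the original path
        RewriteRule ^(.*)$ /3.15.0/$1 [END]

    On Apache 2.2 the same rule with [L] should also stop once the condition matches, so if the loop persists there, a second .htaccess inside the 3.15.0 directory rewriting the request again is a likely suspect.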

    Read the article

  • Run Grunt task in Visual Studio Release Build with a bat file

    - by Aligned
    Originally posted on: http://geekswithblogs.net/Aligned/archive/2014/08/19/run-grunt-task-in-visual-studio-release-build-with-a.aspx

    1. Add a BeforeBuild target to your csproj file. Edit the XML with a text editor:

        <Target Name="BeforeBuild">
          <Exec Condition="'$(Configuration)' == 'Release'" Command="script-optimize.bat" />
        </Target>

    2. Create script-optimize.bat:

        REM "%~dp0" maps to the directory where this file exists
        cd %~dp0\..\YourProjectFolder
        call npm uninstall grunt
        call npm uninstall grunt
        call npm install --cache-min 604800 -g grunt-cli
        call npm install --cache-min 604800
        grunt typescript requirejs copy less:compile less:mincompile

    This grunt command will compile TypeScript, run the RequireJS optimizer, and compile and minimize the LESS.

    3. Make it use the minified code when the Web.config compilation debug is set to false:

        <!-- These CustomCollectFiles actions are used so that the Scripts-Release folder/files are included
             when publishing even though they are not project references -->
        <Target Name="CustomCollectFiles">
          <ItemGroup>
            <_CustomFiles Include="Scripts-Release\**\*" />
          </ItemGroup>
        </Target>

    That should be all you need to get a Grunt task to minify and combine JS (plus other tasks) in a Visual Studio Release build with debug = false. This is a great video of Steve Sanderson talking about SPAs, npm, Knockout, Grunt, Gulp, etc. I highly recommend it.
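
    For context, a hypothetical Gruntfile.js that the batch file above could be driving is sketched here; the plugin names (grunt-typescript, grunt-contrib-requirejs, grunt-contrib-copy, grunt-contrib-less) and every path are assumptions rather than details from the original post:

        module.exports = function (grunt) {
          grunt.initConfig({
            // compile the project's TypeScript sources
            typescript: { base: { src: ['Scripts/**/*.ts'] } },
            // run the RequireJS optimizer into the release folder
            requirejs: {
              compile: {
                options: { baseUrl: 'Scripts', name: 'main', out: 'Scripts-Release/main.js' }
              }
            },
            // copy any remaining scripts alongside the optimized bundle
            copy: {
              release: { files: [{ expand: true, src: ['Scripts/**/*.js'], dest: 'Scripts-Release/' }] }
            },
            // plain and minified LESS -> CSS targets, matching less:compile / less:mincompile above
            less: {
              compile:    { files: { 'Content/site.css': 'Content/site.less' } },
              mincompile: { options: { compress: true }, files: { 'Content/site.min.css': 'Content/site.less' } }
            }
          });
          grunt.loadNpmTasks('grunt-typescript');
          grunt.loadNpmTasks('grunt-contrib-requirejs');
          grunt.loadNpmTasks('grunt-contrib-copy');
          grunt.loadNpmTasks('grunt-contrib-less');
        };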

    Read the article

  • Grant user from one domain permissions to shared folder in another domain

    - by w128
    I have two computers set up like this: \\myPC (local Windows 7 SP1 machine); it is in domain1; \\remotePC (Win Server 2008 with SQL Server - a HyperV virtual machine); it is in domain2. In domain2 active directory, I have a user account RemoteAccount. I would like to give this account full permissions to a shared folder located on \\myPC, i.e. folder \\myPC\SharedFolder. The problem is, when I right-click the folder and go to sharing permissions, I can't add permissions for the domain2\RemoteAccount user, because this user cannot be found - I can only see domain1 users. When I click 'Locations' in "Select users, computers, service accounts, or groups" dialog, I only see domain1. Is there a way to do this?
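
    For what it's worth, the Locations picker only offers domains that the machine's own domain trusts, so the usual prerequisite is a trust (and working DNS resolution) between domain1 and domain2. A hedged sketch with netdom, run with domain admin rights and placeholder account names:

        REM hypothetical: let domain1 trust domain2 so domain2\RemoteAccount can be granted permissions
        netdom trust domain1 /d:domain2 /add /UserD:domain2\Administrator /PasswordD:* /UserO:domain1\Administrator /PasswordO:*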

    Read the article

  • How can I get a user account back?

    - by Ilan
    With all my computers I make one partition for the root and another for /home. This is useful for disasters where I need to reformat the root for Ubuntu but leave my /home data untouched. With the upgrade to 13.10 I had trouble on my wife's computer, so I reinstalled 13.10. My own /home files came up, as expected, as if nothing had happened. For my wife it is a different story, and that is the part where I need help. If I go into Files > Computer I can see the home directory. There I can see ilan (my files) and yona (my wife's files). I can open yona's documents and see all her work. This means that all is well and I just need to hook up to her files. So the problem is that I need to create a user called Yona or yona, but something which will get me exactly to the files of interest. I'm not sure if I created her account as standard or administrator. Is there any way I could tell by looking at the files in /home? I created a new user called Yona as a standard user (hoping that this is the right guess). The account came up as disabled. I pressed the disabled button so I could change the password. I put in her password but it was refused as too short. Too short, too short, but that is what was used and that is what I need. Can anyone help me before my wife comes home and shoots me? Thanks, Ilan
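
    For reference, a rough sketch of the usual command-line approach: find the numeric owner of the existing files, recreate the account with that same UID, and point it at the existing home. The user name yona and UID 1001 below are assumptions; read the real values from the ls output:

        ls -ln /home/yona                  # note the numeric UID and GID that own her files, e.g. 1001
        sudo groupadd -g 1001 yona         # recreate the group with the old GID if it is missing
        sudo useradd -u 1001 -g 1001 -d /home/yona -s /bin/bash yona
        sudo passwd yona                   # run as root this warns about a short password but normally still accepts it
        ls -l /home/yona                   # the files should now show as owned by yona again

    Whether the old account was an administrator does not matter for getting at the files; group membership (e.g. sudo) can be added afterwards with usermod.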

    Read the article

  • Best client and server antivirus for 5 user office?

    - by drpcken
    I'm setting up an Active Directory environment for 5 users (very small) and I'm wondering what the best antivirus is for the clients (Windows 7) and servers (Server 2008 R2 x64). I use Symantec Corp at my organization (50+ users) but I think that is overkill for this company. I wanted to use Microsoft Security Essentials for the clients (I use it for home machines and it's the best free AV in my opinion) but I don't think it will work on the servers (3 servers: PDC, TS, and File). They are behind a SonicWall TZ 200. What would be best? Free would be even better. Thank you!

    Read the article

  • Having problems with Grub2 booting Ubuntu from my External Hard Drive

    - by anonymous
    I installed Ubuntu on my external hard drive but it won't boot on my laptop. What do I do? I did some reading and traced the source of the problem to GRUB 2. Apparently my GRUB 2 setup isn't using the drive's UUID and is using the Linux device name instead (/dev/sdf2). This means that whenever I plug my E-HDD into a system that has a different number of drives connected to it, I won't be able to boot without editing the boot command. I don't understand it too well, but that's what I got from what I read. Is there any way to fix this?
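
    For reference, GRUB 2 on Ubuntu normally does refer to the root filesystem by UUID unless that was disabled, so a rough sketch of the usual check and fix, run from the installed system or a chroot into it (the device name /dev/sdf is an assumption based on the partition above):

        # in /etc/default/grub, remove or comment out this line if it is present:
        #   GRUB_DISABLE_LINUX_UUID=true
        sudo nano /etc/default/grub
        sudo update-grub                 # regenerate grub.cfg with root=UUID=... kernel parameters
        sudo grub-install /dev/sdf       # reinstall GRUB to the external drive's own MBR

    With the boot files and grub.cfg living on the external drive itself and referencing UUIDs, the number of other drives in the host machine stops mattering.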

    Read the article

  • how do I delete my alternate drive icon on the ubuntu desktop?

    - by broiyan
    I'm on a system with a Windows drive and an Ubuntu drive (both physical, not virtual). Under the Ubuntu Places menu, there is a "320 GB file system" which is the Windows disk. The same drive also appears as an icon on the desktop (but unlike everything else on the desktop, it does not appear in the directory listing of ~/Desktop). I think the icon was put on the desktop by accident and I never use it because the Places menu suffices. How can I delete this icon? Selecting then deleting does not work.
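
    For reference, that icon comes from a desktop setting for mounted volumes rather than from a file in ~/Desktop, which is why it never shows up in the directory listing. A hedged sketch, since the exact key depends on the GNOME release in use:

        # GNOME 2 / gconf-based desktops:
        gconftool-2 --type bool --set /apps/nautilus/desktop/volumes_visible false
        # GNOME 3 / gsettings-based desktops:
        gsettings set org.gnome.nautilus.desktop volumes-visible false

    Note that this hides all volume icons on the desktop, not just the Windows drive.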

    Read the article

  • How can I share data from a Samsung Wave Mobile phone with the Mac OS?

    - by M. Bedi
    This is driving me up the wall just a bit. I have a new Samsung Wave mobile phone running the Bada OS. The phone does not come with any software for the Mac; the Kies desktop interface is only available for Windows. I did try installing the Kies software in a VM with Parallels 5, but it did not detect the phone connected via USB. I tried using Bluetooth file exchange on the Mac; it lets me browse the file system on the Wave phone but not actually see any of the media files; I just get empty directory views. But I am able to access files using media sharing with the Wave phone and my PS3. So what would be a Mac desktop app that can be used as a media-sharing browser?

    Read the article

  • <authentication mode="Windows"/>

    - by kareemsaad
    I get this error when I browse a new web site that is a sub-site of a sub-site, http://sharp.elarabygroup.com/ha/deault.aspx (ha is my new web site):

    Configuration Error
    Description: An error occurred during the processing of a configuration file required to service this request. Please review the specific error details below and modify your configuration file appropriately.
    Parser Error Message: It is an error to use a section registered as allowDefinition='MachineToApplication' beyond application level. This error can be caused by a virtual directory not being configured as an application in IIS.
    Source Error:
    Line 61: ASP.NET to identify an incoming user.
    Line 62: --
    Line 63:
    Line 64: section enables configuration
    Line 65: of what to do if/when an unhandled error occurs

    I note that another web site that is also a sub of a sub doesn't have a web.config. Is this related to web.config? Do all the subs and the domain share one web.config?

    Read the article

  • Passenger_wsgi.py given precedence over DirectoryIndex?

    - by Walkerneo
    I was having an issue with my site today, where apache wasn't serving index.php by default. I had moved passenger_wsgi.py to the directory above document root so that I could serve python files without having to use PassengerAppRoot in the .htaccess file. I wanted to do this because I set up a development sub-domain on the site, and I wanted to use a different passenger_wsgi for the two domains, but that meant having different .htaccess files for the different PassengerAppRoots. Is there a way to have passenger_wsgi.py where it was and still let apache serve the index.phps? edit: I'm sorry, I'm tired. I just realized that the way this probably works is that passenger_wsgi.py is handling the routing instead of apache.
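
    For reference, that is indeed how Passenger behaves: once it finds passenger_wsgi.py at what it considers the application root, it routes requests itself instead of letting Apache's DirectoryIndex run. A hedged sketch of keeping the file where it is while handing the PHP document root back to Apache (directive support depends on the Passenger version installed):

        # .htaccess in the PHP document root: serve this tree with plain Apache/PHP
        PassengerEnabled off
        DirectoryIndex index.php index.html

    The development sub-domain can then keep its own .htaccess with PassengerEnabled on (or its own PassengerAppRoot) without the two interfering.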

    Read the article

  • Storage of various linux config files

    - by stantona
    I'm using git to track/store all my various config files required for Linux. They're organized as if they live in my home directory, e.g.:

        .Xresources
        .config/
            Awesome
                rc.lua
        .xmodmap
        .zshrc
        vim/    <- submodule
        emacs/  <- submodule
        etc.

    I use git submodules for other things like the vim/emacs configuration (since I also want to keep those as separate repos). I'm thinking of creating a shell script to create the various links to these files. The goal is to make it easier to set up another Linux machine painlessly. Is this a reasonable idea? Is there a preferred approach? I'm mostly interested in hearing how other people store their configs.
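
    This is a common setup. As a sketch, a small installer along these lines is often all the script needs to do (the repo path and the exact file list are assumptions); GNU stow is the usual ready-made alternative to writing it yourself:

        #!/bin/sh
        # link the versioned dotfiles from the repo into $HOME, backing up anything already there
        REPO="$HOME/dotfiles"
        for f in .Xresources .xmodmap .zshrc .config/awesome/rc.lua; do
            mkdir -p "$HOME/$(dirname "$f")"
            [ -e "$HOME/$f" ] && [ ! -L "$HOME/$f" ] && mv "$HOME/$f" "$HOME/$f.bak"
            ln -sfn "$REPO/$f" "$HOME/$f"
        done
        # pull in the vim/ and emacs/ submodules as well
        (cd "$REPO" && git submodule update --init)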

    Read the article

  • Expression web ftp: Stuck at "Listing subsites"

    - by FrankPython
    When I try to use the Expression Web 4 built-in FTP I see the message "Listing subsites in..." and soon afterward "passive ftp not available". If I switch to active, I get "active ftp not available". There are no subsites; it is a simple directory with one HTML page. The backend is a normal IIS 6 server. FTP to the same IP with other FTP clients works fine! Any idea if Expression Web has some specific requirements? It is our own dedicated server. (Please no tips to use another tool; for this specific project Expression Web is a requirement.)

    Read the article

  • Problems loading Hilva tutorials

    - by Beska
    I'm a newcomer to XNA, and I'm evaluating some libraries. The Hilva Graphics Engine looks interesting, and I'm trying to run their tutorials. However, all of them give me errors. For example, if I download the ParallaxMappingSample demo, and try to build it, I get Error 1 Error loading pipeline assembly "C:\Users\Me\Desktop\ParallaxMappingSample\Hilva.Content.dll". ParallaxMappingSample I get similar errors for all of the samples. Unfortunately, this error isn't very enlightening. I can see the Hilva.Content.dll in the appropriate directory. I tried removing and readding the reference from the content project, but I get the same error. I'm not sure it's relevant, but I'm on Windows 7, I'm using Microsoft Visual Studio 2010, and XNA 4.0. Is there an easy (or difficult) solution? EDIT: If you happen to try this, even if you don't have a solution, let me know about it in a comment. Whether it works for you, or if you get the same problem...either result would be something that might let me know if it's just a problem with the tutorial, or if it's on my end.

    Read the article

  • load syntax as per file prefix

    - by Richo
    Firstly, I hope that this is the right place; I couldn't decide between here and Super User. My home directory lives in an svn repo. All my dotfiles are in version control so that I can track them across multiple machines, and they all source an unversioned .local file (i.e. .screenrc.local, .vimrc.local, etc.) which can override/make local changes to the environment in a machine-specific way. The problem is that vim understands how I want to edit many of these config files, but loses its mind when I open a .local, and honestly I'm not really sure how it works out how to syntax-highlight a file like .screenrc in the first place. The pseudocode for what I'm after is:

        if OpenedFile.ends_with(".local")
            behave_as_per OpenedFile[0:-6]
        endif

    I hope this makes sense and hopefully someone can shed light on whether or not this is possible.
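
    For reference, a hedged vimscript sketch of exactly that pseudocode for ~/.vimrc: it strips the trailing .local and re-runs vim's normal filetype detection against the shortened name, so .screenrc.local picks up whatever .screenrc would get:

        " detect *.local files as if the .local suffix were not there
        augroup local_suffix_detect
          autocmd!
          autocmd BufRead,BufNewFile *.local
                \ execute 'doautocmd filetypedetect BufRead ' . fnameescape(expand('<afile>:r'))
        augroup END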

    Read the article

  • Nginx HTTPS redirects causing loop

    - by Ben Chiappetta
    I've been banging my head against the wall trying to figure this out, so if anyone can help I'd appreciate it. My Nginx conf has three different redirect loops; I haven't been able to get any of the three to work right. The three problem areas are:

    Redirecting the memcache directory to SSL
    Redirecting the accounts directory to SSL
    Redirecting SSL to www if non-www

    nginx.conf:

        user nginx;
        worker_processes 1;
        error_log /var/log/nginx/error.log warn;
        pid /var/run/nginx.pid;
        events {
            worker_connections 1024;
        }
        http {
            include /etc/nginx/mime.types;
            default_type application/octet-stream;
            log_format main '$remote_addr - $remote_user [$time_local] "$request" '
                            '$status $body_bytes_sent "$http_referer" '
                            '"$http_user_agent" "$http_x_forwarded_for"';
            access_log /var/log/nginx/access.log main;
            error_log /var/log/nginx/error.log notice;
            sendfile on;
            #tcp_nopush on;
            keepalive_timeout 65;
            proxy_set_header X-Url-Scheme $scheme;
            #gzip on;
            rewrite_log on;
            include /etc/nginx/conf.d/*.conf;
        }

    conf.d/default.conf:

        server {
            listen 80;
            server_name <redacted>.net;
            rewrite ^(.*) http://www.<redacted>.net$1;
        }
        server {
            listen 80;
            server_name www.<redacted>.net;
            set_real_ip_from 192.168.30.4;
            set_real_ip_from 192.168.30.5;
            set_real_ip_from 192.168.30.10;
            real_ip_header X-Forwarded-For;
            #charset koi8-r;
            access_log /var/log/nginx/host.access.log main;
            root /var/www/html;
            index index.php index.html index.htm;
            location =/memcache {
                rewrite ^/(.*)$ https://$server_name$request_uri? permanent;
            }
            location /accounts {
                rewrite ^/(.*)$ https://$server_name$request_uri? permanent;
            }
            #error_page 404 /404.html;
            # redirect server error pages to the static page /50x.html
            #
            error_page 500 502 503 504 /50x.html;
            location = /50x.html {
            }
            # pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000
            #
            location ~ \.php$ {
                fastcgi_pass 127.0.0.1:9000;
                fastcgi_index index.php;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                include /etc/nginx/fastcgi_params;
                try_files $uri = 404;
            }
            # deny access to .htaccess files, if Apache's document root
            # concurs with nginx's one
            #
            location ~ /\.ht {
                deny all;
            }
        }

    conf.d/ssl.conf:

        # HTTPS server
        #
        server {
            listen 443;
            server_name <redacted>.net;
            rewrite ^(.*) https://www.<redacted>.net$1;
        }
        server {
            listen 443 default_server ssl;
            server_name www.<redacted>.net;
            set_real_ip_from 192.168.30.4;
            set_real_ip_from 192.168.30.5;
            set_real_ip_from 192.168.30.10;
            real_ip_header X-Forwarded-For;
            proxy_set_header X-Forwarded_Proto https;
            proxy_set_header Host $host;
            proxy_redirect off;
            proxy_max_temp_file_size 0;
            proxy_set_header X-Forwarded-Ssl on;
            set $https_enabled on;
            ssl_certificate <redacted>.crt;
            ssl_certificate_key <redacted>.key;
            ssl_session_timeout 5m;
            ssl_protocols SSLv2 SSLv3 TLSv1;
            ssl_ciphers HIGH:!aNULL:!MD5;
            ssl_prefer_server_ciphers on;
            root /var/www/html;
            index index.php index.html index.htm;
            location /memcache {
                auth_basic "Restricted";
                auth_basic_user_file $document_root/memcache/.htpasswd;
            }
            location ~ \.php$ {
                fastcgi_pass 127.0.0.1:9000;
                fastcgi_index index.php;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                fastcgi_param HTTPS on;
                include /etc/nginx/fastcgi_params;
                try_files $uri = 404;
            }
        }
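
    For reference, one hedged observation and sketch, assuming the 192.168.30.x hosts in front of this box terminate SSL and then talk to nginx over plain HTTP (a common cause of exactly this kind of loop): in that case the port-80 vhost keeps seeing http and keeps redirecting. Basing the redirect on the forwarded scheme instead usually breaks the cycle:

        # in the port-80 www server block
        location /accounts {
            # redirect only if the client-facing request was not already HTTPS
            if ($http_x_forwarded_proto != "https") {
                return 301 https://www.<redacted>.net$request_uri;
            }
        }

    One smaller point worth checking: "location =/memcache" is an exact match, so a request for /memcache/anything never hits that rule at all.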

    Read the article

  • Twitter Tuesday - Top 10 @ArchBeat Tweets - August 12-18, 2014

    - by Bob Rhubart-Oracle
    Man in gray hat: "You know, more than three thousand people follow @OTNArchBeat on Twitter. I wonder which tweets were the most popular over the last seven days." Man in brown hat: "Shut up! I think I see a UFO!" Man in gray hat: "That's OK. I'll just read this blog post."

    RT @java: "Programmers are creative people and typically delight in contriving clever ways to solve problems." -Casimir Saternos in @OracleJavaMag (Aug 18, 2014 at 12:54 PM)
    The Offer Still Stands: Produce your own episode of the OTN ArchBeat Podcast. Click for details. (Aug 13, 2014 at 02:03 PM)
    Binge-Ready! Watch the Top 10 OTN ArchBeat Videos featuring @stewartbryson @stenvesterli @gurcanorhan (Aug 13, 2014 at 11:49 AM)
    Oracle Announces First Java 9 Features | InfoQ (Aug 18, 2014 at 12:20 PM)
    Getting Started with the #Coherence Memcached Adaptor | David Felcey (Aug 18, 2014 at 10:19 AM)
    #WebLogic Data Source Connection Labeling | Steve Felts (Aug 14, 2014 at 10:03 AM)
    How to introduce #DevOps into a moribund corporate culture | ZDNet (Aug 15, 2014 at 11:23 AM)
    Sample Chapter: Installing Oracle #WebLogic Server 12c and Using the Management Tools | Sam Alapati (Aug 14, 2014 at 11:09 AM)
    Building a Responsive #WebCenter Portal Application | @JayJayZheng (Aug 12, 2014 at 11:04 AM)
    #OEM12c Cloud Control authorization with Active Directory | Jeroen Gouma (Aug 14, 2014 at 10:16 AM)

    Read the article

  • Is there a RAR extractor (for multiple rar files like .r00 etc.) that will use all of my quad cores?

    - by Christopher Done
    I've got a quad core Intel processor. I've got a big file split into little ones as RAR files, foo.r00, foo.r01, etc. which the RAR program extracts into one file/directory. Is there a RAR program that I can specify like "use four cores" in the extract process? At the moment it sits there using 100% of one core. I recognise the bottleneck might be my hard drive anyway, but I don't see a lot of HD usage and suspect the decompression process is more intensive than waiting on I/O. For example, GNU Make accepts a (-j, I think) argument to tell it how many cores to use, which I used to compile PHP 6 really quickly.

    Read the article

  • CMD: Append to path without duplicating it?

    - by Horst Walter
    For one CMD session I can easily set a new path: SET PATH=%PATH%;"insert custom path here" Doing so in a batch file does not consider whether the custom path is already included. How do I avoid duplicating it (i.e. check whether it is already contained in the PATH "string"). Remarks: Related: How do I append user-defined environment variables to the system variable PATH in Windows 7? Related: How can I permanently append an entry into the system's PATH variable, via command line? Same question for UNIX: Add directory to $PATH if it's not already there
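
    For reference, one common batch idiom for this (the folder below is just an example) checks the current value case-insensitively before appending, using the surrounding semicolons to avoid partial matches:

        set "ADDPATH=C:\my tools"
        rem append only when ;ADDPATH; does not already occur inside ;PATH;
        echo ;%PATH%; | find /I ";%ADDPATH%;" >nul || set "PATH=%PATH%;%ADDPATH%"

    It is not bulletproof (a trailing backslash or a short-name spelling of the same folder will fool it), but it covers the usual "ran the batch file twice" case.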

    Read the article

  • After installing updates, Windows 7 reboots when it gets to "Preparing to Configure"

    - by Travis
    After letting Windows 7 Pro install updates at shutdown, I now cannot get it to boot back up. I have tried selecting safe mode and "Last Known Good Configuration" and I get the same results. It gets to the screen that says "Preparing to configure Windows" and then reboots. If it is giving me a BSOD, it is happening so fast I cannot see it. This is a generic PC that has been running fine for the last year. It had 22 updates to do at shutdown. Windows 7 Pro Service Pack 1, 64-bit, in an Active Directory domain.

    Read the article

  • Mac failing (failed?) hard drive - is all hope lost?

    - by Daniel
    It's a 500 GB Seagate laptop hard drive that came with my Macbook Pro. Apple partition format. Already replaced and now have it external, connected via SATA/USB adapter. Trying to get just a few files that I worked on while out of town when it crashed (and thus did not have my time machine backup drive). Drive will not mount, but OS X Disk Utility detects it and can read the capacity, model number, and even the name of the partition, which leads me to believe all hope may not be lost. Failed attempts so far: Disk Utility verify+repair says drive cannot be repaired and that I should back up immediately (lovely) Disk Warrior says it cannot rebuild the directory due to hardware failure Data Rescue quick & deep scans immediately failed PhotoRec says "error reading sector" for every sector (at least for the few minutes I let it run before closing it to explore other options) What else can I try here? Again, I'm just looking for a few, small files (python scripts to be specific) - not a full recovery.
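
    For reference, since the files wanted are small text (Python scripts), one avenue not yet on the list is imaging whatever still reads with GNU ddrescue and then pointing the recovery tools at the image instead of the failing hardware. A hedged sketch; the device name and file names are assumptions:

        # copy everything that still reads into an image, retrying bad sectors a few times
        sudo ddrescue -d -r3 /dev/disk2 seagate.img seagate.map
        # the map file lets a later run resume and keep retrying only the bad areas
        # then run Data Rescue / PhotoRec against seagate.img rather than the dying drive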

    Read the article

  • Best practice to create an ftp administrator account on vsftpd

    - by jtd
    Background: My manager would like me to create an administration account for our FTP server. When logged in via FTP, it should instantly display all of the home directories of the users, and be able to modify any directory or file in any way possible. What would be the best way to go about this? I planned on chrooting this FTP admin to /home, but I don't know how to properly go about the permissions. Maybe make a group called ftp_admins and chgrp the /home folder? But then wouldn't it affect the users accessing their folders? Any help is appreciated.
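
    For reference, a hedged sketch of one way to do this with vsftpd's per-user configuration plus filesystem ACLs, which leaves the regular users' ownership and group setup untouched (the ftpadmin name and paths are assumptions):

        # /etc/vsftpd.conf
        chroot_local_user=YES
        user_config_dir=/etc/vsftpd/user_conf
        # recent vsftpd versions may also require: allow_writeable_chroot=YES

        # /etc/vsftpd/user_conf/ftpadmin   (overrides applied to the admin account only)
        local_root=/home

        # shell: grant ftpadmin full access to every home via ACLs instead of chgrp
        setfacl -R -m u:ftpadmin:rwX /home
        setfacl -R -d -m u:ftpadmin:rwX /home    # default ACL so files created later inherit it

    Because the access comes from an ACL entry for ftpadmin alone, the users' own permissions and group memberships are unaffected.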

    Read the article

  • SIMPLEST way to set up password protection for a static site, with basic admin UI?

    - by Joseph Turian
    I have a static site. I would like the simplest approach to password protecting a directory, with a basic admin UI for adding/removing users. I will have so few users that I don't care about performance. I don't care if it's PHP or Django or whatever, I just want a complete software package. Apache basic auth isn't good, because you can't log out. Nor is there a UI for adding users. I tried throwing everything behind Django auth and serving the files through Django. However, Chrome treats all my text/css headers as text/plain, so I don't get any stylesheets showing. I can't use mod_xsendfile on my server because I can't reconfigure Apache to add new modules. I think this approach is overkill anyway. I can try configuring Nginx's X-Accel-Redirect, however that requires implementing all the Django code for auth myself, and I'd prefer an existing solution. However, this is my backup plan. Is there a code package that implements authentication with basic admin for a static site?
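
    For what it's worth, the X-Accel-Redirect backup plan needs very little custom code on top of django.contrib.auth, which already ships login/logout views and an admin for adding and removing users. A hedged sketch; the URL layout and paths are assumptions:

        # nginx: files are only reachable through an internal redirect, never directly
        location /protected/ {
            internal;
            alias /srv/static-site/;
        }

        # Django view
        from django.contrib.auth.decorators import login_required
        from django.http import HttpResponse

        @login_required
        def serve_page(request, path):
            response = HttpResponse()                       # empty body, nginx supplies the file
            response["X-Accel-Redirect"] = "/protected/" + path
            del response["Content-Type"]                    # let nginx pick the MIME type
            return response

    Since nginx does the actual file serving and content-type guessing, the text/plain stylesheet problem seen when Django streamed the files directly should also go away.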

    Read the article

  • Install Git on a Media Temple (dv) 4.0 server

    - by Chris
    I'm trying to install Git on my Media Temple (dv) 4.0 server. I've followed these instructions. It seems to have "installed", as there are a boat-load of files in the /root/git-2012-06-06 directory. However, when I perform any git command in the server, I receive this message: git: command not found My assumption is that something, somewhere is not configured properly, but I have no idea where to start. Could anybody lend a hand / offer some pointers? (And if you hadn't guessed, I'm pretty new to all this, so please be kind!)
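
    For reference, "git: command not found" after a source build usually just means the install step never ran, or installed somewhere outside the PATH. A rough sketch from the directory mentioned above, assuming the build prerequisites from that guide are already in place:

        cd /root/git-2012-06-06
        make prefix=/usr/local all
        make prefix=/usr/local install
        # /usr/local/bin is normally already on the PATH; confirm with:
        which git && git --version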

    Read the article

  • How can I find out what .desktop file is being launched?

    - by iBelieve
    I've used click install and click register to install a click app on Ubuntu (not Ubuntu Touch). The version was 0.5.1. Now a new version (v0.5.6) is available and I installed it using the same method. I know the new version is installed because the "current" symlink points to 0.5.6:

        $ ll /opt/click.ubuntu.com/com.ubuntu.developer.mdspencer.ubuntu-tasks/
        total 16
        drwxr-xr-x  4 clickpkg clickpkg 4096 Oct 18 10:19 ./
        drwxr-xr-x  8 clickpkg clickpkg 4096 Sep 13 21:22 ../
        drwxr-xr-x 10 clickpkg clickpkg 4096 Sep 13 20:01 0.5.1/
        drwxr-xr-x 10 clickpkg clickpkg 4096 Oct 18 10:19 0.5.6/
        lrwxrwxrwx  1 clickpkg clickpkg    5 Oct 18 10:19 current -> 0.5.6/

    However, when I launch the application from the Dash, the about page still shows v0.5.1. So my question is, how can I find out where the .desktop file that I'm launching resides, so I can understand why the correct version isn't being launched? I'm also simply curious to learn where the click .desktop files live. Is there some tool that shows where a given .desktop file is, or is there a way to see the equivalent to $PATH for .desktop files? Note: this is similar to, but not a duplicate of, How to find the .desktop files for pinned applications in the Unity launcher?
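
    For reference, .desktop files are looked up in each directory of $XDG_DATA_DIRS/applications plus the per-user ~/.local/share/applications, so a quick grep across those usually answers both questions (the file-name pattern below is an assumption):

        # which .desktop files mention this app at all?
        grep -l "ubuntu-tasks" ~/.local/share/applications/*.desktop /usr/share/applications/*.desktop 2>/dev/null
        # what does the matching launcher actually execute?
        grep -E '^(Exec|Path|X-Ubuntu-Application-ID)=' ~/.local/share/applications/*mdspencer*.desktop

    If the Exec or Path line still points into the 0.5.1 directory instead of current/, that stale launcher is the likely culprit.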

    Read the article

  • anonymous access to ftpd

    - by gcb
    I need FTP on my local LAN so my scanner can send me scans. I am on Debian sid. I installed ftpd and created the user anonymous, and removed anonymous from /etc/ftpusers. When I try to log in as anonymous it says: login failed. I can't seem to find any log file. I'm using FTP server (Version 6.4/OpenBSD/Linux-ftpd-0.17). Update: I needed this so my printer had one place to store my scanned documents on the LAN. I solved it by buying a dedicated USB pen drive (thankfully I bought a printer with plenty of options), but in the past this was a very simple task: install ftpd, specify the directory for anonymous access, and be done with it. Now every FTP daemon has several layers of security and chroot at its core... this is just crazy when all you need is a place for a printer to dump files on a secured network.
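
    For reference, linux-ftpd exposes very few options, so here is a hedged sketch with vsftpd instead (a different daemon, swapped in deliberately; the upload directory is an assumption) that accepts anonymous uploads from a scanner on a trusted LAN:

        # /etc/vsftpd.conf
        listen=YES
        anonymous_enable=YES
        write_enable=YES
        anon_upload_enable=YES
        anon_mkdir_write_enable=YES
        anon_root=/srv/ftp
        # the upload target must be writable by the unprivileged ftp user:
        #   mkdir -p /srv/ftp/scans && chown ftp:ftp /srv/ftp/scans

    With classic linux-ftpd, anonymous logins are usually tied to a properly set-up "ftp" system account and its home directory rather than to a user literally named anonymous, which may be why the bare anonymous user gives "login failed".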

    Read the article
