Search Results

Search found 24207 results on 969 pages for 'anonymous users'.

Page 271/969

  • Can I see if and when a file was deleted on Windows Server 2003?

    - by user316687
    On Windows Server 2003, is there a way to see if and when a file was deleted? It's a web server with IIS; our web application lets users upload Word documents to the server. However, we found that one Word file is missing and would like to know whether it was deleted or never existed (the web app couldn't load it). EDIT: I tried to follow this: enable auditing on the folder you want to keep track of. Right-click on the folder, go to "Sharing and Security", then the "Security" tab, and at the bottom click "Advanced". Select the "Auditing" tab, click "Add", select the group or users to track, then pick what actions you want to track. To track file deletion you would enable: Create Files/Write Data (Success/Fail), Create Folders/Append Data (Success/Fail), Delete Subfolders and Files (Success/Fail), Delete (Success/Fail). This only applies from now on; past actions can't be tracked?
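
    If auditing is set up that way (and the machine-level "Audit object access" policy is enabled), deletions are written to the Security event log. On pre-Vista systems the relevant event IDs are, if I have them right, 560 (handle opened, which carries the file name) and 564 (object deleted); the stock eventquery.vbs script on Server 2003 can pull them out from the command line:

        cscript %SystemRoot%\system32\eventquery.vbs /l security /fi "id eq 564"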

    Read the article

  • Architectural advice - web camera remote access

    - by Alan Hollis
    I'm looking for architectural advice. I have a client I've built a website for which essentially allows users to view their web cameras remotely. The current flow of data is as follows: User opens page to view web camera image. JavaScript polls a URL on the server (appended with a unique timestamp) every 1000 ms. An FTP connection is enabled for the camera's FTP user. Web camera opens an FTP connection to the server. Web camera begins taking photos. Web camera sends each photo to the FTP server. On image URL request: the server reads the latest image on the hard drive uploaded via FTP for that camera, and deletes any older images. This is working okay at the moment for a small number of users/cameras (about 10 users and around the same number of cameras), but we're starting to worry about the scalability of this approach. My original plan was that instead of having the files read from the local disk, the web server would open an FTP connection to the FTP server and read the latest images directly from there, meaning we should have been able to scale horizontally fairly easily. But FTP connection establishment times were too slow (mainly because PHP out of the box is unable to persist FTP connections), so we abandoned this approach and went straight for reading from the hard drive. The firmware provider for the cameras states they're able to build an HTTP client which, instead of using FTP to upload the image, could POST the image to a web server. This seems plausible enough to me, but I'm looking for some architectural advice. My current thought is a simple Nginx/PHP/Redis stack. The web camera POSTs the latest image to Nginx/PHP, and the latest image for that camera is stored in Redis. Clients can then pull the latest image from Redis, which should be extremely quick as the images will always be stored in memory. The data flow would then become: User opens page to view web camera image. JavaScript polls a URL on the server (appended with a unique timestamp) every 1000 ms. Camera is sent an HTTP request to start posting images to a provided URL. Web camera begins taking photos. Web camera sends POST requests to the server as fast as it can. On image URL request: the server reads the latest image from Redis; the server tells Redis to delete the older image. My questions are: Are there any greater overheads to transferring images via HTTP instead of FTP? Is there a simple way to calculate how many cameras we could have streaming at once? Is there any way to prevent potentially DoS'ing our own servers with web camera requests? Is Redis a good solution to this problem? Should I abandon the PHP/Nginx combination and go for something else? Is this proposed solution actually any good? Will adding HTTPS to the mix cause posting the image to become too slow? Thanks in advance, Alan
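
    For what it's worth, the Redis half of the proposal is tiny. A sketch in Python rather than the PHP the stack would actually use (key names and the client library choice are assumptions):

        import redis

        r = redis.StrictRedis(host="localhost", port=6379)

        def camera_post(camera_id, image_bytes):
            # one key per camera; SET overwrites in place, so the stale frame
            # vanishes on its own and no explicit delete step is needed
            r.set("camera:%s:latest" % camera_id, image_bytes)

        def client_poll(camera_id):
            # served straight from memory, which keeps the 1000 ms polling cheap
            return r.get("camera:%s:latest" % camera_id)

    Because SET overwrites, the "tell Redis to delete the older image" step disappears; Redis holds exactly one frame per camera, which also bounds memory use to cameras x frame size.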

    Read the article

  • PATH variable recently changed

    - by Dan
    $ loopy Command 'loopy' is available in '/usr/games/loopy' The command could not be located because '/usr/games' is not included in the PATH environment variable. loopy: command not found Every answer I've found just says to add it to my .profile... but this should be in the PATH for all users, and it was up until recently (I have no idea what would have caused it to change). How can I solve this on my system for all users? What could have caused this to change? Thanks.
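
    For a system-wide fix, a sketch of where to look on Ubuntu (the PATH value shown is the stock 12.04-era default, quoted from memory):

        # the system-wide PATH comes from /etc/environment,
        # plus anything appended in /etc/profile and /etc/profile.d/*.sh
        grep PATH /etc/environment
        # the stock line includes /usr/games, something like:
        # PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games"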

    Read the article

  • rsync --link-dest behaviour when run as sudo

    - by fotNelton
    In order to create regular backups, I'm using rsync together with --link-dest so as to create hard links for unchanged files. For example: rsync -ax \ --partial --delete --delete-excluded --inplace \ --exclude-from=/tmp/temp_excludes \ --link-dest=/Volumes/Backup/current \ /Users /Volumes/Backup/2012-06-25 This works very well as long as I start the process from my normal user account. But as soon as I start the process using sudo it behaves erratically, meaning that rsync copies all the unchanged files instead of hard-linking them. Since sudo modifies the environment, I've also tried sudo -E in conjunction with making sure that my sudoers file has the corresponding option set. Well, that didn't work either. So, the question is, how can I run rsync using sudo? Whereas the above example only shows a backup of the Users directory, I also need to back up some system files that I can only access as root.
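
    A likely cause, offered as a guess: run as root, rsync's -a preserves owner and group, which a non-root run silently could not, so files in the previous (non-root) snapshot no longer match what root wants to create, and --link-dest only hard-links exact matches. rsync's --itemize-changes flag will show which attribute differs; a dry-run sketch against a scratch destination:

        # %i-style itemized output marks owner/group/permission/time differences,
        # the usual culprits when the previous snapshot was made without root
        sudo rsync -ax --dry-run --itemize-changes \
            --link-dest=/Volumes/Backup/current \
            /Users /Volumes/Backup/test-run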

    Read the article

  • How to set up an email server on Ubuntu 12.04 LTS (Debian 7 wheezy/sid) running on a Linode VPS

    - by shihon
    I am working on an email server. I have tried several times to create an email server on Ubuntu 12.04 LTS with Postfix + Dovecot + Postfixadmin + Courier + ClamAV + SpamAssassin. But every time I install these packages I face new problems, like mail sent to localhost users ending up in the users' Maildir. I can't work out how to configure it to send email to an external SMTP host like Gmail or Yahoo. Worst of all, I can't work out how to use SASL, because I am not using SSL, so it isn't much use for my domain. This is so complicated; I have searched everywhere on Google: links are https://help.ubuntu.com/community/PostfixCompleteVirtualMailSystemHowto http://www.starbridge.org/spip/spip.php?article1&lang=fr http://knopix.wordpress.com/2008/01/16/postfixadmin-postgresql-courier-squirrelmail-on-debian-etch-howtotutorial/ http://flurdy.com/docs/postfix/ Is there an article for installing an email server on Ubuntu 12.04 LTS? Please help me to understand these things.
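
    One way to make each failure concrete while debugging, assuming a stock Debian/Ubuntu Postfix install (mail is from the mailutils package; the log path is the Debian default):

        # send a test message and watch Postfix's verdict on the delivery attempt
        echo "test body" | mail -s "test" someuser@gmail.com
        tail -f /var/log/mail.log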

    Read the article

  • What does the red X icon mean next to a user in folder permissions (Windows 7)?

    - by Scott Szretter
    In trying to debug various strange issues on a machine, I found something strange: when I go to C:\Users\administrator, Properties, Security tab, it lists the users (the local admin account, SYSTEM, and 'administrator', which is the domain administrator account). It all looks fine in terms of permissions (full control, etc.) compared to other machines. The one difference is a small red circle with an X to the left of the user icon/name. Additionally, there are various folders under there where it says access denied; for example, My Documents! Even logged in as the local machine administrator account (which is not named 'administrator'), I am unable to change the permissions; it says access denied. Any ideas what this means and how to fix it? I even tried re-joining the machine to the domain.
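
    For the access-denied folders, the commonly suggested repair is to take ownership and reset the inherited ACLs from an elevated prompt; a sketch, with the caveat that it discards the profile's current ACLs:

        rem take ownership of the tree, answering yes to prompts
        takeown /f C:\Users\administrator /r /d y
        rem replace the ACLs with default inherited permissions
        icacls C:\Users\administrator /reset /t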

    Read the article

  • Linux router with different gateways for incoming and outgoing connections

    - by nkout
    I have the following topology: LAN users: 192.168.1.2-254 (192.168.1.0/24). gateway1: 192.168.2.2/24, used for all outgoing connections of LAN users (default gateway). gateway2: 192.168.3.2/24, used for incoming services (destination NAT; ports 80 and 443 are forwarded to 192.168.2.1). Linux router/server R: eth0 192.168.1.1/24 (LAN), eth1 192.168.2.1/24 (WWAN1), eth2 192.168.3.1/24 (WWAN2). I want to route all outgoing traffic coming from the LAN and from R via 192.168.2.2, and route the responses to incoming connections via 192.168.3.2. My config:

        ifconfig eth0 up 192.168.1.1 netmask 255.255.255.0
        ifconfig eth1 up 192.168.2.1 netmask 255.255.255.0
        ifconfig eth2 up 192.168.3.1 netmask 255.255.255.0
        echo 1 >/proc/sys/net/ipv4/ip_forward
        route add default gw 192.168.2.2
        iptables -t nat -A POSTROUTING ! -d 192.168.0.0/16 -j MASQUERADE

    I want to add an iptables rule to mark incoming traffic from WWAN2 and send the responses back out via WWAN2, while keeping the default gateway on WWAN1.
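
    What's described is source/policy routing; iproute2 handles it with a second routing table selected by a connection mark. A sketch (the table number and mark value are arbitrary choices):

        # replies to WWAN2 connections leave via WWAN2's gateway
        ip route add default via 192.168.3.2 dev eth2 table 100
        ip rule add fwmark 1 table 100

        # tag new connections arriving on WWAN2, remembering the tag in conntrack
        iptables -t mangle -A PREROUTING -i eth2 -m conntrack --ctstate NEW -j CONNMARK --set-mark 1
        # put the tag back on reply packets returning from the LAN host
        iptables -t mangle -A PREROUTING -i eth0 -j CONNMARK --restore-mark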

    Read the article

  • Repeat the csv header twice without "Append" (PowerShell 1.0)

    - by Mark
    I have prepared a PowerShell script to export a list of system users in CSV format. The script can output the users list with Export-Csv with a single header row (at the top). However, my requirement is to repeat the header row twice in the file. This is easy to achieve in PowerShell 3.0 with "Append" (e.g. $header | out-file $filepath -Append), but our server environment is running PowerShell 1.0, so I cannot do that. Is there any workaround? I cannot manually add it myself. Thank you.
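
    A workaround that sticks to PowerShell 1.0 cmdlets: export once, then rewrite the file with the header line doubled ($users and $filepath are placeholders):

        $users | Export-Csv -Path $filepath -NoTypeInformation
        $lines  = @(Get-Content $filepath)
        $header = $lines[0]
        # write the header twice, then the original data rows
        @($header, $header) + $lines[1..($lines.Length - 1)] | Set-Content $filepath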

    Read the article

  • How can I check the actual size used in an NTFS directory with many hardlinks?

    - by kbyrd
    On a Win7 NTFS volume, I'm using cwrsync, which supports --link-dest correctly, to create "snapshot"-type backups. So I have: z:\backups\2010-11-28\cygdrive\c\Users\... z:\backups\2010-12-02\cygdrive\c\Users\... The content of 2010-12-02 is mostly hard links back to files in the 2010-11-28 directory, but there are a few new or changed files only in 2010-12-02. On Linux, the 'du' utility will tell me the actual size taken by each incremental snapshot. On Windows, Explorer and du under Cygwin are both fooled by hard links and show 2010-12-02 taking up a little more space than 2010-11-28. Is there a Windows utility that will show the correct space actually used?
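
    Explorer has no view for this, but on Windows 7 fsutil can enumerate every path that shares one file's data, which at least makes spot checks possible (the path below is a placeholder):

        fsutil hardlink list "z:\backups\2010-12-02\cygdrive\c\Users\somefile.doc"

    A file that also lists a 2010-11-28 path costs the 2010-12-02 snapshot nothing; only files whose sole name is under 2010-12-02 represent new space.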

    Read the article

  • What strategy should be employed to access Facebook data offline?

    - by user686021
    I'm working on a project similar to Klout, which provides detail about how you influence other people and who influenced you. We'll be fetching data from a few social networking sites (e.g. LinkedIn, Facebook, Twitter) to analyze how users interact with one another. For that we need to parse the data, store it in a db, and analyze it so that the strength of the relationship between two users can be determined. We'll be accessing data offline as well, to provide accurate results. If we consider Facebook activity, we need access to Facebook users' news feed and wall data, which includes likes, comments, shares, etc. To decide how one user influences another, we'll store all the data and analyze it. I need suggestions on what steps to take for good performance. We'll be using ASP.NET (C#) Web Forms, SQL Server, and jQuery. The main concern is parsing of data, and its storage and retrieval with the least overhead. I've summarized a few points below: Should we switch over to a document-oriented database, like MongoDB or RavenDB, for the whole app or part of it, even though no team member has experience with them? Should we use SQL Server Analysis Services? Is there any other library than Json.NET for parsing data? Is it advisable to use any C# library over FQL + GET requests? I've tried to provide as much info as possible. Please share your views.
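
    On the fetching side, the Graph API pages its results, so whatever parser is chosen has to follow the cursors. A minimal sketch in Python rather than C#, just to show the shape of the data (the endpoint and token handling are assumptions):

        import requests

        def fetch_feed(access_token):
            """Pull every post from /me/feed, following the paging cursors."""
            posts = []
            url = "https://graph.facebook.com/me/feed?access_token=" + access_token
            while url:
                payload = requests.get(url).json()
                posts.extend(payload.get("data", []))
                # the API hands back a ready-made URL for the next page, if any
                url = payload.get("paging", {}).get("next")
            return posts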

    Read the article

  • Windows 8 Remote Desktop only allows one user at a time?

    - by segmentation fault
    I tried connecting to Windows 8 using its built-in Remote Desktop feature, but for some inexplicable reason, it requires that no users are logged in on the target machine before a remote user can log in. This has never been a problem with rdesktop on Unixen; I could rdesktop from as many machines as I wanted and any logged-in users would never notice a thing. What's the problem with Windows? Any way to allow concurrent local and remote logins to a Windows 8 machine without hacks or cracks? The "guides" on how to do this that show up in the Google results all suggest replacing a system DLL with a hacked one, but that's not acceptable.

    Read the article

  • Is there a proven concept to website reverse certificate authentication?

    - by Tom
    We're looking at exposing some of our internal application data externally via a website. The actual details of the website aren't that interesting; it'll be built using ASP.NET/IIS etc., which might be relevant. Essentially, I'm looking for a mechanism to authenticate users viewing the website. This sounds trivial, and a username/password is typically fine, but I want more. Now I've read enough about SSL/X.509 to realise that the CA determines that we're alright, and that the user can trust us. But I want to trust the user; I want the user to be rejected if they don't have the correct credentials. I've seen a system for online banking whereby the bank issues a certificate which gets installed on the user's computer (it was actually smartcard-based). If the website can't discover/utilise the key pair then you are immediately rejected! This is brutal, but necessary. Is there a mechanism where I can do the following: generate a certificate for a user; issue the certificate for them to install, so it can be installed on one machine only; deny all access if their certificate is not accessible; use a standard username/password scheme after that; and employ SSL using their certificate once they're "in". This really must already exist; please point me in the right direction! Thanks for your help :)
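
    What's described is standard TLS client-certificate authentication (mutual TLS): run a private CA, issue each user a certificate, and have IIS require a client certificate during the handshake. A sketch of issuing one client certificate with OpenSSL, assuming the CA cert and key already exist (all file names are placeholders):

        openssl genrsa -out alice.key 2048
        openssl req -new -key alice.key -out alice.csr -subj "/CN=alice"
        openssl x509 -req -in alice.csr -CA ca.crt -CAkey ca.key \
            -CAcreateserial -out alice.crt -days 365
        # bundle cert + key for installation in the user's browser/OS store
        openssl pkcs12 -export -in alice.crt -inkey alice.key -out alice.p12

    Binding the certificate to a single machine is the one part TLS alone doesn't give you; that's what the bank's smartcard was doing, and marking the private key non-exportable in the Windows certificate store approximates it.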

    Read the article

  • Accessing our Intranet from outside our Network - WITHOUT VPN

    - by westexasman
    We just upgraded our company intranet from an IIS-based ASP (poorly written) server/code base to a Windows Server 2008 R2 (Apache/MySQL/PHP) server. The old server allowed users to log in to intranet.xxx.org using their AD user/pass, which then led them to the company intranet from basically anywhere they had Internet access. We want to mimic that functionality (or change it to something more secure) with the new setup. This was seemingly set up for off-site employees running on a state network. The state network does not allow VPN; therefore, we needed a way to allow those employees access to the intranet. So, how do we go about allowing users to log in from the outside world and gain access to our intranet?
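
    One common shape for this, sketched under the assumption that the Apache box can reach a domain controller: terminate HTTPS on Apache and check logins against AD over LDAP with mod_authnz_ldap (hostnames, DNs, and the bind account are placeholders):

        <Location "/">
            AuthType Basic
            AuthName "Intranet"
            AuthBasicProvider ldap
            AuthLDAPURL "ldap://dc.example.org:389/DC=example,DC=org?sAMAccountName?sub"
            AuthLDAPBindDN "CN=svc-apache,CN=Users,DC=example,DC=org"
            AuthLDAPBindPassword "secret"
            Require valid-user
        </Location>

    Basic auth sends the password with every request, so this only makes sense over HTTPS.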

    Read the article

  • reverse_proxy (mod_rewrite) and rails

    - by SooDesuNe
    I have a front-end Rails app that reverse proxies to any of a number of backend Rails apps depending on URL. For example, http://www.my_host.com/app_one reverse proxies to http://www.remote_host_running_app_one.com, such that a URL like http://www.my_host.com/app_one/users will display the contents of http://www.remote_host_running_app_one.com/users. I have a large and ever-expanding number of backends, so they cannot be explicitly listed anywhere other than a database. This is no problem for mod_rewrite using a prg:/ rewrite map reverse proxy. The question is, the URLs returned by Rails helpers have the form /controller/action, making them absolute to the root. This is a problem for the page served by mod_rewrite, because links on the proxied page appear as absolute to the domain. I.e.: http://www.my_host.com/app_one/controller/action has links that end up looking like /controller/action when they need to look like /app_one/controller/action. Is there a way to fix this server-side, so that the links will be routed correctly?
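
    If mod_proxy can be used for the proxying itself (the prg: map can still compute the target), mod_proxy_html exists precisely to rewrite root-relative links inside proxied HTML; a sketch for one backend:

        ProxyPass        /app_one/ http://www.remote_host_running_app_one.com/
        ProxyPassReverse /app_one/ http://www.remote_host_running_app_one.com/
        # mod_proxy_html rewrites matching URLs in the returned markup
        ProxyHTMLEnable  On
        ProxyHTMLURLMap  / /app_one/

    The Rails-side alternative, if each backend can be configured, is to tell it that it is mounted under a prefix (relative_url_root), so the helpers emit /app_one/controller/action themselves.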

    Read the article

  • Linux File Permissions & Access Control Query

    - by Jason
    Hi, let's say I am user bob in group users. There is this file: -rw----r-- 1 root users 4 May 8 22:34 testfile First question: why can't bob read the file, since it's readable by others? Is it simply that if you are denied by group, then you are automatically blacklisted for others? I always assumed that the final 3 bits took precedence over the user/group permission bits; guess I was wrong... Second question: how is this implemented? I suppose it's linked to the first question, but how does this work in relation to access control? Is it related to how ACLs work / are queried? Just trying to understand how these 9 permission bits are actually implemented/used in Linux. Thanks a lot.
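
    For the first question: classic Unix permissions pick exactly one class per access (owner if the UID matches, otherwise group if any of the user's groups match, otherwise other) and apply only that class's bits, with no fall-through to a weaker class. bob matches the group, so the group bits (---) are the whole story and the other bits are never consulted:

        # as bob, a member of group 'users':
        $ ls -l testfile
        -rw----r-- 1 root users 4 May  8 22:34 testfile
        $ cat testfile
        cat: testfile: Permission denied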

    Read the article

  • Constant prompts for credentials from one Mac Outlook 2011 client

    - by Top__Hat
    The majority of my Exchange users are on Windows 7 and have no issues (at least using Outlook...), but a subset of the executives are ardent Mac users running Outlook 2011 for OS X. One of these clients is prompted every 5-10 minutes for credentials; ticking the checkbox to remember credentials does not fix the situation. The Mac OS version is 10.7.2. I have already removed and rebuilt the EWS virtual directory on my Client Access server. Outlook Anywhere is set to NTLM authentication. None of the Microsoft clients are experiencing this issue. What else can I do to make this issue go away?

    Read the article

  • Bayesian content filter for vbulletin [on hold]

    - by mc0e
    I've been tasked with coming up with a tool to automatically flag some posts for moderator attention on a large vBulletin forum. It's not spam per se, but the task has a lot in common with the sort of handling that might be done by a spam-protection plugin (a "mod" in vBulletin speak). There's only so much I can say, but the task does not involve bad users so much as particular kinds of posts which the moderators need to be aware of. Filtering on user registrations and links is therefore not useful, and we are talking about posts by real human users. What I'm looking for is an existing Bayesian classification plugin, or something I can study to get an understanding of how to do the vBulletin side of the interface in order to build such a thing. I.e., I'd need ways for moderators to list flagged posts and to correct the classification of posts which have been mis-classified. Ideally I want a 3-way split with an "unsure" category in order to reduce what has to be reviewed to find any mis-classifications. Any pointers? I've searched around a bit, and so far what I've found has been more or less entirely targeted at intervening in sign-ups (mostly using StopForumSpam), captchas, and use of external services like Akismet, which are spam-specific. I'm also considering an external solution, which might be able to be interfaced with vBulletin.
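
    On the classifier side, the 3-way split is just two thresholds over the score a naive Bayes model produces; a sketch in Python (the threshold values are arbitrary starting points that would be tuned against moderator corrections):

        def classify(p_flag):
            """p_flag: model's probability that a post needs moderator attention."""
            if p_flag >= 0.9:
                return "flag"      # goes on the moderators' list
            if p_flag <= 0.1:
                return "ok"        # publish untouched
            return "unsure"        # small review queue; corrections retrain the model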

    Read the article

  • How can I unobtrusively back up a few clients' email?

    - by tladuke
    This is a small office. Our web/email server is a shared host. In the office we have a Windows 2008 box up all the time that runs our NAS and a couple of other services. I don't have access to the ISP admin stuff, but I assume it has cPanel or something like that; I can get access if I ask. I want to get email backed up from the server to our NAS without the users having to do anything. I suppose I could set up Outlook on that server with everyone's account, but that's a terrible idea (and what about sent mail?). The boss uses Outlook, but we have Apple Mail and Thunderbird clients too. I guess the important thing is that Outlook can read the backups, so the boss is happy. Then again, maybe it should be stored in whatever is the most portable format (that will work on NTFS). This is for about 10 users.
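
    One unobtrusive pattern: pull each mailbox over IMAP on a schedule, straight into Maildir folders on the NAS. A sketch of an offlineimap config for one account (host, user, and paths are placeholders):

        # ~/.offlineimaprc - minimal sketch for one mailbox
        [general]
        accounts = Boss

        [Account Boss]
        localrepository = BossLocal
        remoterepository = BossRemote

        [Repository BossLocal]
        type = Maildir
        localfolders = /nas/mailbackup/boss

        [Repository BossRemote]
        type = IMAP
        remotehost = mail.example.com
        remoteuser = boss

    Maildir is one plain file per message, which keeps the backup portable on NTFS even if no client reads it natively.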

    Read the article

  • Outgoing only SMTP server

    - by Din
    I want to set up an outgoing-only SMTP server on my Debian box, so my web applications will be able to send mail via it. I don't want to use other hosts, because of security and the customizability of my own setup. I want to set up Postfix and configure it only in outgoing mode (I don't want it to be a relay). So I want to ask advice on how to do this in the best way. There's no need for users, virtual users, endpoints, or other options that Postfix provides. I suppose I only need to attach Postfix to some hostname so my IP address can resolve to it. I also think Postfix may be a bit of a complicated solution for this task; if you know a much simpler tool for it, let me know. Thanks.
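
    A send-only Postfix comes down to a handful of main.cf settings; a sketch (the hostname is a placeholder):

        # /etc/postfix/main.cf - send-only sketch
        myhostname = mail.example.com
        # listen on loopback only, so nothing outside can use this as a relay
        inet_interfaces = loopback-only
        mynetworks = 127.0.0.0/8
        # no local delivery: every message goes out over SMTP
        mydestination =

    Lighter tools like nullmailer or msmtp exist, but they forward through an existing smarthost rather than delivering directly to Gmail's or Yahoo's MX hosts, so a send-only Postfix is the usual answer here.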

    Read the article

  • Why is a Java application running slowly for clients on Linux?

    - by Darshani
    We have two Linux servers which have Apache Tomcat installed. One is running the database; the other is running the Java application. More than 100 users are connected to the servers. The server machines are quad-core normal PCs with 4 GB of memory. Everything ran properly for the last 6 months, but the application has recently started to run slowly. The Java application suddenly gets stuck and users cannot work for some time. There is no network issue. I am trying to identify the reason for this, whether it is a machine problem or a problem in the Java application. Can anybody help me with this?
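
    When the next stall happens, it helps to capture what the JVM is doing before guessing. A sketch using the stock JDK tools (replace <pid> with the Tomcat process ID):

        top -H -p <pid>            # which threads are burning CPU
        jstack <pid> > stack.txt   # thread dump: look for BLOCKED/WAITING threads
        jstat -gcutil <pid> 1000   # GC activity, sampled every second

    Periodic freezes on a 4 GB box with a growing user count often turn out to be full-GC pauses or an exhausted connection pool; the thread dump and GC stats distinguish the two.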

    Read the article

  • An international mobile app - Should I set up EC2 instances in multiple regions?

    - by ashiina
    I am currently trying to launch a mobile app for users around the world. It is not a spectacular launch which will get millions of users in weeks; just another individual developer releasing an app. I know enough about the techniques of managing timezones, internationalizing strings, and whatnot (the application layer). But I cannot find any information on how I should manage my EC2 instances... Should I be setting up EC2 instances in different regions around the world? Is that a must-do, or is it overkill? I'm aware that it's the ideal solution in terms of performance, but it becomes very tough managing servers in multiple regions: DB issues, AMI management, etc. I'd much rather NOT do so. So I would like to know the general best practice when launching an international app/website. Note: for static content, I know it's better to use a CDN, so I'm planning on doing so.

    Read the article

  • Dynamic authentication realms in Apache

    - by Cogsy
    I have a front-end server acting as a gateway proxy for many (a dynamically changing number of) building monitors with embedded webservers. They are accessed with URLs like: http://www.example.com/monitor1/ http://www.example.com/monitor2/ ... I'm trying to restrict access to these monitors to only the users that own them. So what I need is a way of specifying rights to users or groups for specific directories. The standard auth mechanisms I see in Apache won't work, because I would need to specify every location. I'd prefer some dynamic map or script. Any suggestions?
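
    Since the set of monitors already lives somewhere authoritative, one workable pattern is to regenerate the per-location auth stanzas from that data whenever it changes and reload Apache gracefully; mod_authn_dbd is the fully dynamic alternative if regeneration feels clunky. A sketch (the table, paths, and htpasswd file are placeholders):

        mysql -N -e "SELECT monitor, owner FROM monitors" authdb |
        while read -r monitor owner; do
            printf '<Location "/%s/">\n' "$monitor"
            printf '  AuthType Basic\n'
            printf '  AuthName "%s"\n' "$monitor"
            printf '  AuthUserFile /etc/apache2/monitors.htpasswd\n'
            printf '  Require user %s\n' "$owner"
            printf '</Location>\n'
        done > /etc/apache2/conf.d/monitor-auth.conf
        apachectl graceful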

    Read the article

  • Which programming language should I choose? I want to build this website ... [closed]

    - by Goma
    Assume that I will start with just a photo-sharing website, where every user can add comments to any photo. After that, the site will contain news (general news); the admin can add any news, and the moderators as well, while the users can add comments on this news. The website will also provide a photo uploader, so every user will have up to 20 MB to upload any photos they want. Other users can see these photos or not, depending on the option the main user chose (whether or not he wants to publish his photos). The site should have a small type of forum, which provides the ability for the admin to add categories and for users to add topics and replies for each topic in these categories. These are the things I can think of now, but the website will add other features and services later on. Can you tell me which programming language can help me do all that? I need a programming language that provides the following: 1- fast page loads. 2- easy to add more functions quickly and easy to edit code for any reason. 3- secure. 4- fast at displaying information from the database.

    Read the article

  • WiFi problems on several Ubuntu installations

    - by Rickyfresh
    Okay, this is the first time I have ever had to ask a question, as usually the Ubuntu community has answered everything already. But on this occasion many people are asking for the answer and not one good solution has become available so far, so someone please help, or I will have to install Windows on my son's and my girlfriend's PCs, and that would be a disaster, as I am trying to help convince people to move from Windows. I installed 12.04 on three computers on the same day: a Dell Inspiron (works perfectly), a Toshiba Satellite, and a home-built desktop. The Dell works perfectly, but the other two either keep losing the connection to the wireless Internet, or, even when they are connected, they stop connecting to web sites; for some reason they search Google fine but will not connect to web sites when a link is clicked. So far people have recommended in other forums: removing Network Manager and installing wicd (didn't solve it); changing the MTU in the wireless settings (didn't solve it); all sorts of messing about with Firefox settings (this doesn't solve it, and even if it did, it would leave most average PC users scratching their heads and wishing they had stuck to Windows). The problem exists on two very different machines with different wireless cards, so I doubt it's a driver or hardware issue; also, many other Ubuntu users are having the same problem with a vast array of different machines and wireless cards. Can someone please give a good solution to this, as it's going to turn a lot of people away from Ubuntu if it cannot be sorted. I would give some PC specs, but the two machines are vastly different, and the other people complaining of this problem also have very different systems, all showing the same problem.

    Read the article

  • Best CMS for review-type sites

    - by Pru
    Is there an ideal CMS for making a review site? By review site, I mean like a restaurant review site where you have each entry belonging to different major categories like Cuisine and City. Then users can browse and filter by each or by combination (Chinese food in Los Angeles, with suggestions of other Chinese restaurants in LA, etc.). Furthermore, I'd want it to support other fields like price, parking, kid-friendliness, etc., and to have users be able to filter by those criteria. I've been told that with a combination of custom taxonomies, plug-ins, and many clever little queries, WordPress 3.x can handle this. But I'm having a heck of a time with it getting into the nitty gritty, and that's where I find the community support is lacking. The sort of stuff you'd think would work in WP, like making one parent category for Cuisine and one for City, doesn't really work once you get further in and start trying to pull it all together. Then you find these blog posts where people say, "This example shows that one could create a huge movie review site using custom taxonomies..." but when you go and try it you hit all sorts of challenges and oddities that point a big long finger at WordPress being, in fact, a blogging platform. The best I came up with was one category for the cuisine and one tag for the city; then I created a couple of custom tag-like taxonomies for the other features. It's quite a mess to try to figure out how to assemble all of that into a natural, intuitive site. I expect a few versions down the road WP will be able to do these sorts of sites out of the box. So I thought I'd take a step back before I run back into the WordPress fray and find out if maybe there is another platform better suited to this sort of relational content site. Directory scripts in some ways offer many of the features I'm looking for, but I need something more flexible and, hopefully, interactive (comments, reviews). I'm especially looking for feedback from people who've crafted sites like this. Thanks!

    Read the article
