Search Results

Search found 6311 results on 253 pages for 'limit clause'.

  • Windows Server 2003: Remapping external domain

    - by Chuck Harmston
    We're playing a going-away prank on a coworker, and would like to use a rule in our internal DNS server to redirect techcrunch.com to point at one of our internal development servers. Basically, I'd like to accomplish the same thing as adding a line to a Linux /etc/hosts file, only for the entire network. I have access to our DNS server. How would you go about doing this? I created an entry in the reverse lookup subnet with the 'Host Name' of techcrunch.com and the 'Host IP' of our development server, a Linux box running Debian on which I've created a virtualhost to handle requests to techcrunch.com. It doesn't appear to be working, however, and my expertise has reached its limit. Thanks!
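
    The entry most likely went into the wrong zone: reverse lookup zones map IP addresses back to names and are never consulted when a browser resolves techcrunch.com. What's needed is a forward lookup zone named techcrunch.com whose records point at the dev box; the server then answers authoritatively for that one name while everything else resolves normally. A sketch using dnscmd (the 10.0.0.5 address is a placeholder for your development server):

        dnscmd /ZoneAdd techcrunch.com /Primary /file techcrunch.com.dns
        dnscmd /RecordAdd techcrunch.com @   A 10.0.0.5
        dnscmd /RecordAdd techcrunch.com www A 10.0.0.5

    Clients that have already cached the real address may also need an ipconfig /flushdns before the prank lands.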

  • System will only boot into a Live CD with cores limited in the BIOS and ACPI off

    - by CookieOfFortune
    I have a system that was running a Q6600 and Windows 7 RC. It crashed last night into a BSOD with a MACHINE_CHECK_EXCEPTION, and now it does that every time it boots into Windows. I tried my Ubuntu Live CD, but the kernel would error out; the stack trace showed something along the lines of "Cannot synchronize to one of the CPUs". Working from this, I enabled the "Limit CPU to 3 cores" option in my BIOS and tried again. This time it seemed to die after an ACPI call, so I disabled ACPI during boot, and now the system runs from the Ubuntu Live CD showing 2 cores. Does anyone here think I have any hope, or is it simply a CPU waiting to die? EDIT: 1 core now.
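
    For reference, the same constraints can be approximated with kernel boot parameters instead of BIOS toggles. A sketch, assuming a standard Ubuntu Live CD where the kernel line can be edited at the boot menu:

        maxcpus=1 acpi=off noapic

    If the machine only stays up with progressively fewer cores enabled, that pattern would be consistent with failing hardware (CPU, voltage regulation, or motherboard) rather than a software problem.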

  • Install multiport module on iptables

    - by tarteauxfraises
    I'm trying to install fail2ban on Cubidebian, a Debian port for the Cubieboard (a Raspberry Pi-like board). The following rule fails because of the "-m multiport --dports ssh" options (it works when I run the command manually without multiport):

        $ iptables -I INPUT -p tcp -m multiport --dports ssh -j fail2ban-ssh
        iptables: No chain/target/match by that name.

    When I cat /proc/net/ip_tables_matches, I see that the multiport module is not listed:

        $ cat /proc/net/ip_tables_matches
        u32 time string statistic state owner pkttype mac limit helper connmark mark ah icmp socket socket quota2 policy length iprange ttl hashlimit ecn udplite udp tcp

    What can I do to compile or enable the multiport module? Thanks in advance for your help.
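
    A first thing to try, sketched below: on a modular kernel this match lives in the xt_multiport module and may simply not be loaded. If the board's kernel was built without it, the fallback is one iptables rule per port; the config symbol to look for when rebuilding is CONFIG_NETFILTER_XT_MATCH_MULTIPORT.

        # try loading the match module directly
        modprobe xt_multiport
        grep multiport /proc/net/ip_tables_matches

        # if the module doesn't exist, check how the kernel was built
        # (only works when /proc/config.gz is available)
        zgrep NETFILTER_XT_MATCH_MULTIPORT /proc/config.gz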

  • Preventing my Postfix server from sending my local users' spam

    - by Jack
    I have a Postfix/Dovecot mail server with 100 different users. When they send an email they need to be authenticated; I successfully use saslauthd to achieve this. A few days ago I had a problem: one specific user, probably with a virus or spam bot installed on their computer, started to send thousands of emails through my server in a few hours. As a result, my IP was blocked by many ISPs (AOL, Yahoo, and others) and listed on many blacklists, leaving all 100 of my users unable to send email to anyone. What is the best practice to avoid this problem? It would be great if my server could recognize a spamming user and automatically block them. A limit of, say, 30 emails per hour per user could also be a partial solution. Any idea how to face this problem? Thank you
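
    One common way to get exactly that per-user cap is a Postfix policy daemon such as postfwd. A minimal sketch, untested, assuming postfwd on its default port 10040; the rule name and the 30-per-hour numbers are placeholders:

        # /etc/postfwd.cf -- limit each SASL login to 30 messages per hour
        id=RATE01; sasl_username=~^.+$; action=rate(sasl_username/30/3600/450 4.7.1 sending rate limit exceeded)

        # main.cf -- consult the policy daemon before permitting relay
        smtpd_recipient_restrictions =
            check_policy_service inet:127.0.0.1:10040,
            permit_sasl_authenticated,
            reject_unauth_destination

    Returning a 450 defers rather than bounces, so a legitimate burst from a real user is simply retried after the window expires.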

  • How to secure Apache for shared hosting environment? (chrooting, avoid symlinking...)

    - by Alessio Periloso
    I'm having problems with my Apache configuration: I want to limit each user to his own docroot (so a chroot() would be what I'm looking for), but: (1) mod_chroot works only globally, not per virtualhost. My users live in paths like /home/vhosts/xxxxx/domains/domain.tld/public_html (xxxxx is the user), and chrooting /home/vhosts doesn't solve anything, because the users would still be able to see each other. (2) apache-mod-itk would slow the websites down too much, and I'm not sure it would solve anything. (3) Without either of the previous two, I think the only thing left is to prevent symlinking, so users can't link to something that doesn't belong to them. So I think I'm going to follow the third point, but... how do I efficiently prevent symlinking while still keeping mod_rewrite working? PHP has already been chrooted with php-fpm, so my only concern is Apache itself.
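
    A sketch of the usual Options-based answer, assuming the vhost layout above: SymLinksIfOwnerMatch makes Apache follow a symlink only when the link and its target share an owner, which blocks cross-user links, and the mod_rewrite documentation accepts it as an alternative to FollowSymLinks in per-directory context (at the cost of an extra lstat per request):

        <Directory /home/vhosts/*/domains/*/public_html>
            Options -FollowSymLinks +SymLinksIfOwnerMatch
            # let .htaccess re-enable only the owner-checked variant
            AllowOverride FileInfo Options=SymLinksIfOwnerMatch
        </Directory>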

  • Windows network: deny file access to another user if file is currently open

    - by Steve
    Some users on my network are having difficulty saving files because the file is open elsewhere. Let's say Lemuel wants to edit a file, but Bernice in the next office over is working on it. Lemuel opens, edits, and tries to save, but then gets a "no write access" error. Bernice chortles with glee (since earlier that week Lemuel stole her sandwich). Unfortunately, various programs will not warn the user that they have opened a read-only file. Is there a way (in Windows) to limit file access to ONE user only, i.e. 777 for the first user to open the file and 000 for all users after that? (Sorry for the Linux terminology, but it gets my point across.)
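
    There is no built-in per-first-opener lockout like that; whether a second opener is blocked depends on the sharing mode each application requests when it opens the file. What you can do on the file server is see who currently holds a file open, sketched below (the filename is a placeholder):

        REM run on the file server hosting the share
        openfiles /query /fo table /v | findstr /i "budget.xlsx"

        REM the older equivalent
        net file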

  • Warming up an IIS Application Pool automatically?

    - by Michael Stum
    So IIS likes to shut down app pools that aren't in use. While this makes sense, I would like certain app pools to run continuously, but I don't want to simply disable the automatic app pool recycling, since some of its settings (e.g., the maximum memory limit) are good to have. I know that Microsoft announced the IIS Application Warmup module as an IIS 7.5 feature, only to do a bait-and-switch and pull it again so they could put it in IIS 8 instead, so I wonder if something exists that runs on IIS 7.5/Windows 2008 R2?
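
    A sketch of the usual two-part workaround, assuming an app pool named "MyAppPool" (a placeholder): zero out just the idle timeout while leaving the memory-based recycling limits in place, then have a scheduled task re-warm the site after each recycle:

        REM disable only the idle timeout; memory limits stay in force
        %windir%\system32\inetsrv\appcmd set apppool "MyAppPool" -processModel.idleTimeout:00:00:00

        REM scheduled-task warm-up request (PowerShell 2.0 compatible)
        powershell -command "(New-Object Net.WebClient).DownloadString('http://localhost/') | Out-Null"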

  • Limited access to Amazon S3 buckets

    - by Tomas Markauskas
    Is it possible to somehow limit access to an Amazon S3 account? I don't really like the idea of distributing my secret access key to all of the applications that only need to access a single bucket on my account; if someone gains access to one of the applications, I could lose all my data stored on S3. One way I was thinking of doing it would be to create a second S3 account and give it access to just one bucket of the main account, but that's not really a great solution. Another nice thing for me would be to give the secondary account only write (but not modify/delete) and read access. That way I could upload backups or other files and be sure that they won't get lost.
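
    This is what AWS IAM is for: you can create one IAM user per application, each with its own keys and a policy scoped to a single bucket. A sketch (the bucket name is a placeholder) that allows listing, reading, and writing but omits s3:DeleteObject, so deletes are denied by default. One caveat: s3:PutObject can still overwrite an existing key, so for true write-once semantics you would also enable bucket versioning.

        {
          "Version": "2012-10-17",
          "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:ListBucket", "s3:GetObject", "s3:PutObject"],
            "Resource": [
              "arn:aws:s3:::my-backup-bucket",
              "arn:aws:s3:::my-backup-bucket/*"
            ]
          }]
        }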

  • ESXi 5.1 - Unable to register host

    - by deanvz
    I downloaded and successfully installed ESXi 5.1. I am, however, unable to get the licence key I received installed; assigning it fails with: "An error occurred when assigning the specified licence key: The system Memory is not satisfied with the 32 GB of Maximum memory limit. Current with 80.00 GB of Memory." Is there any way around this? A quick Google search revealed that this is a widespread problem with no real answer or resolution: the only workaround is to remove physical RAM chips. Since this host is going to be in production I don't want to do that, as it would mean downtime when I have to reinsert the memory.

  • Prevent Java application from accessing/monitoring/altering clipboard contents

    - by mcstrother
    I'm a student using a service that provides practice questions for standardized tests. The service requires that I access the questions by downloading and running a Java application. If I try to copy anything from any window on my computer (including applications unrelated to the question bank) while the application is running, the copied item is replaced with an obnoxious message asking me not to pirate their copyrighted material. I find this obnoxious, and I also really don't like the idea that any application can slurp up any and all potentially sensitive information that I happen to copy while it's running. Is there a way to limit the privileges of this application to stop it from doing this? Thank you!
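
    One lever Java itself provides is the security manager: clipboard access is gated by java.awt.AWTPermission "accessClipboard", so running the app under a policy that never grants that permission blocks both reading and writing the system clipboard. A sketch, with the jar name as a placeholder and the granted permissions guessed at (the app may need more to function):

        // restrict.policy -- note: no AWTPermission "accessClipboard" granted
        grant {
            permission java.io.FilePermission "<<ALL FILES>>", "read,write";
            permission java.net.SocketPermission "*", "connect,resolve";
            permission java.util.PropertyPermission "*", "read";
        };

        // launch under the security manager:
        // java -Djava.security.manager -Djava.security.policy=restrict.policy -jar questionbank.jar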

  • Linux: upload/download difference on network shares

    - by Batsu
    I have a Red Hat Enterprise Linux 6 machine (with SELinux) that shows a significant speed difference between download and upload of files shared over the LAN, the upload side being much slower. The bottleneck seems to be the output of the Linux machine: I get a rate of around 1Mb/s when

    - WinXP machines download files shared (using Samba) by the RHEL machine
    - the RHEL machine uploads files to a WinXP machine's shared folder

    while

    - uploading from the XP machines to the RHEL machine's shares
    - downloading the XP machines' shares on the RHEL machine
    - any share between Windows machines

    all run smoothly (around 50Mb/s). Since the upload from RHEL to a WinXP share is slowed too, I would rule out an issue in the Samba configuration. What could be limiting the upload speed? Update: iptables doesn't show any output rule, and disabling it makes no noticeable difference, so I would rule that out too.
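
    A classic cause of one-directional slowness is a speed/duplex mismatch on the NIC: half duplex shows up as normal pings but badly asymmetric bulk transfer. A quick check, sketched below, assuming the interface is eth0 and that the switch port is configured to match before anything is forced:

        # look at the negotiated Speed/Duplex lines
        ethtool eth0

        # if autonegotiation picked half duplex, try forcing it
        ethtool -s eth0 speed 1000 duplex full autoneg off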

  • XAMPP Closes the connection

    - by Miro Markarian
    I want my XAMPP Apache server to host a file of around 250MB, but the server closes the connection and won't let me download it. Does XAMPP or Apache have a download size limit or something? I tested with a smaller file and the problem is still present; it doesn't let me download any file from the server. All I get in the error log is this: [Fri Sep 07 23:21:31.742625 2012] [authz_core:debug] [pid 3664:tid 396] mod_authz_core.c(808): [client x.x.x.x:23409] AH01628: authorization result: granted (no directives), referer: http://ammiprox.tk/greeneyes2910/
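
    Apache has no download size limit, and the log line shown is only a debug message, not an error. One known culprit for stalled large-file downloads on Windows builds of Apache is its use of sendfile/mmap; a hedged first thing to try in httpd.conf:

        # work around large-file stalls seen with some Windows network stacks
        EnableSendfile off
        EnableMMAP off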

  • How to manage a large email delivery volume from an Email Marketing App?

    - by Newtonx
    We provide an email marketing service through our online application. We have about 30 customers, and each one has its own mailing list (5k to 100k emails each). What we really want is to distribute email delivery between 2 or more servers. I was wondering what kind of approach/solution MailChimp or Constant Contact uses to provide a great service. Many servers? Many IPs? Our spam policy suspends any user/customer that gets 10% bounces. We currently rotate our outgoing mail IP once the delivery limit per remote host is reached. Is that the best approach/solution?
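
    Large senders generally do both: several MTAs and several dedicated IPs with consistent reverse DNS, so each address builds its own reputation. If the MTA happens to be Postfix, outgoing traffic can be spread across source addresses with per-transport smtp_bind_address; a sketch (the IPs and map name are placeholders):

        # master.cf -- one smtp transport per outgoing IP
        out1  unix  -  -  n  -  -  smtp -o smtp_bind_address=198.51.100.1
        out2  unix  -  -  n  -  -  smtp -o smtp_bind_address=198.51.100.2

        # main.cf -- choose the transport per sender (Postfix 2.7+)
        sender_dependent_default_transport_maps = hash:/etc/postfix/sender_transport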

  • Squid IP-based authentication

    - by Ian R.
    I have 10 IPs on a VPS with squid3 installed, and I want to lease them to 10 co-workers. The authentication should be IP-based: basically, I want to allow only each co-worker's home IP address (not an internal one; we're not on a shared network) to connect to my Squid. I would also like to give each of them a dedicated IP from my pool of outgoing addresses. I managed to get this working with username/password authentication, but some software does not support that feature, so I would like to switch to this kind of restriction if possible. Any guidance/sample ACLs?
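
    A per-worker sketch of squid.conf, pairing a src ACL with tcp_outgoing_address (all addresses below are placeholders; repeat the three lines for each co-worker). The obvious weakness is that home IPs on consumer connections can change, which re-creates the roaming problem:

        acl worker1 src 203.0.113.10            # worker 1's home IP
        tcp_outgoing_address 198.51.100.1 worker1
        http_access allow worker1

        # after all workers:
        http_access deny all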

  • Need help translating rate limiting iptables rules to Puppet format

    - by geoffroy
    I use the Puppet Iptables module to manage iptables rules on my machine. I'd like to implement rate limiting for failed SSH connections as described here: Hundreds of failed ssh logins

        iptables -A INPUT -p tcp --dport 22 -m recent --update --seconds 60 --hitcount 5 --name SSH --rsource -j DROP
        iptables -A INPUT -p tcp --dport 22 -m recent --set --name SSH --rsource -j ACCEPT

    Is it possible to translate this into Puppet syntax, such as:

        firewall { '015 drop 5 failed attempts to connect to SSH in a minute':
          proto  => 'tcp',
          port   => 22,
          action => 'drop',
          # what are the other parameters?
        }

    Any help welcome. Best regards, Geoffroy
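
    The puppetlabs-firewall type does expose the recent-match options; a sketch of how the two rules above might map, untested, with parameter names as documented in that module:

        firewall { '015 drop 5 failed SSH attempts in a minute':
          proto    => 'tcp',
          port     => 22,
          recent   => 'update',
          rseconds => 60,
          rhits    => 5,
          rname    => 'SSH',
          rsource  => true,
          action   => 'drop',
        }

        firewall { '016 track new SSH connections':
          proto   => 'tcp',
          port    => 22,
          recent  => 'set',
          rname   => 'SSH',
          rsource => true,
          action  => 'accept',
        }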

  • How to know my wireless card has injection enabled?

    - by shrimpy
    I am playing around with aircrack-ng and was trying to see whether the wireless card in my laptop can pass the injection test, and I ended up seeing the following. Does it mean my wireless card is not able to run aircrack?

        root@myubuntu:/home/myubuntu# iwconfig
        lo    no wireless extensions.
        eth0  no wireless extensions.
        eth1  IEEE 802.11bg  ESSID:""  Nickname:""
              Mode:Managed  Frequency:2.437 GHz  Access Point: Not-Associated
              Bit Rate:54 Mb/s  Tx-Power:24 dBm
              Retry min limit:7  RTS thr:off  Fragment thr:off
              Power Management:off
              Link Quality=5/5  Signal level=0 dBm  Noise level=-57 dBm
              Rx invalid nwid:0  Rx invalid crypt:781  Rx invalid frag:0
              Tx excessive retries:0  Invalid misc:0  Missed beacon:0
        root@myubuntu:/home/myubuntu# aireplay-ng -9 eth1
        ioctl(SIOCSIWMODE) failed: Invalid argument
        ARP linktype is set to 1 (Ethernet) - expected ARPHRD_IEEE80211,
        ARPHRD_IEEE80211_FULL or ARPHRD_IEEE80211_PRISM instead.
        Make sure RFMON is enabled: run 'airmon-ng start eth1 <#>'
        Sysfs injection support was not found either.
        root@myubuntu:/home/myubuntu#
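
    The output says the card is still in managed mode, not that injection is impossible: aireplay-ng needs a monitor-mode (RFMON) interface. A sketch of the usual next step; the monitor interface name varies by driver, so use whatever airmon-ng reports. One hedged aside: an interface named eth1 is often an Intel ipw2200, which does not support injection with the stock driver.

        airmon-ng start eth1
        # then re-run the test against the monitor interface it created
        aireplay-ng -9 mon0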

  • iptables, allow access from certain MAC addresses

    - by user788171
    Presently I limit which clients can access my server by IP address via iptables; only approved IP addresses can connect. The problem with this is that if a client is on a laptop and goes to a different location, they can no longer connect because their IP has changed. For a variety of reasons, iptables is the only authentication option I have. Is there a way to restrict access by device instead of by IP address, for instance, to only allow certain MAC addresses to connect to port 5000? Is it possible to do this via iptables? Note that the computers are not on the same network; they could be connecting from anywhere in the world.
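
    iptables does have a MAC match, sketched below with a placeholder address, but it cannot help here: source MAC addresses are only visible on the local Ethernet segment, and for a client connecting across the internet the server sees the MAC of its own upstream router, never the laptop's. For roaming clients the usual substitutes are a VPN or port knocking in front of port 5000.

        # works only for hosts on the same LAN segment
        iptables -A INPUT -p tcp --dport 5000 -m mac --mac-source 00:11:22:33:44:55 -j ACCEPT
        iptables -A INPUT -p tcp --dport 5000 -j DROP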

  • Windows 7 cannot burn a DVD-R at 4x or 8x? (it seems to have to use the MAX speed)

    - by Jian Lin
    Windows 7's built-in ability to burn an .iso to a DVD-R is great to have, but it seems there is no way to change the burn speed from "max" (16x) down to 8x or 4x. Some of my DVD drives cannot reliably read data from discs burned at 16x, which is why I usually limit burning to 8x; also, some articles say a disc burned at 4x can last longer, so I am sometimes tempted to burn at 4x. Is there any method or hack that can make this happen?

  • Where would an S3 upload speed cap originate?

    - by CoreyH
    I do a ton of uploading to S3 and am experiencing capped speeds, and I can't quite figure out how to address it. The setup: Windows Server 2008 R2 x64, an external HD, a Java-based upload tool called Jsh3ll, and custom VBS scripts to kick the jobs off. Running one process at a time, I am always limited to about 4Mbps. I have FiOS at 35/35Mbps, so it isn't an outright line limit. And I can run parallel instances all the way up to 35Mbps, so I know the problem isn't gateway/NIC/machine/Amazon related. Running parallel instances works as a solution to a degree, but it greatly increases the complexity of my workflow; solving this would make my life dramatically easier. When I was first doing this I played around with a bunch of Windows TCP parameters and was able to briefly get unconstrained bandwidth, but it wasn't repeatable. Thoughts?
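
    A per-stream cap combined with full speed across parallel streams is the signature of a TCP window limit: a single connection can never exceed the window size divided by the round-trip time. A worked sketch, assuming a 64 KB effective window and roughly 125 ms round trip to the S3 endpoint (both numbers are assumptions):

        single-stream throughput <= window / RTT
                                  = (64 * 1024 * 8) bits / 0.125 s
                                 ~= 4.2 Mbps

    That points at either Windows receive-window auto-tuning being disabled (netsh interface tcp show global will show it) or, more likely for uploads, the Java tool setting a small socket send buffer, which only its own code, or more parallel streams, can work around.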

  • Recommend web file sharing software, please

    - by Baczek
    I'm looking for a web platform to put company files on. My requirements:

    - should be accessible via a browser
    - should be open source
    - must be installable (Dropbox is a no-go)
    - must have an option to put an access time limit on a file
    - must perform garbage collection automatically after a file expires
    - must be able to mark files as public or private
    - an option to protect a file via a PIN code, for users without accounts in the system, would be nice to have

    The problem is I don't even know what to search for; all my googling results in either complete groupware solutions or P2P file sharing software. If such a thing doesn't exist, please don't hesitate to say so, so I can crawl to a corner and cry myself to sleep. TIA

  • Utility for notifying a user that their roaming profile is getting too large to copy before shutdown?

    - by leeand00
    My users are having an issue with their roaming profiles getting too large, at which point the roaming profile is lost. I believe this is because they are storing too much in their profiles. Is there a program that can be installed on Windows that will:

    - listen for a logoff event
    - check the size of the roaming profile against a size limit I set
    - if the roaming profile is too big, notify the user that they have to decrease its size

    Does a program like this exist, or does it need to be written?
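
    Windows ships something close to this: the "Limit profile size" group policy (User Configuration > Administrative Templates > System > User Profiles) runs proquota.exe, which warns the user and can hold up logoff until the profile shrinks below the quota. If you'd rather roll your own, a logoff-script sketch in PowerShell; the threshold is a placeholder, and note it measures the whole profile folder rather than only the roamed portion:

        $limitMB = 50
        $files = Get-ChildItem $env:USERPROFILE -Recurse -Force -ErrorAction SilentlyContinue |
                 Where-Object { -not $_.PSIsContainer }
        $sizeMB = ($files | Measure-Object -Property Length -Sum).Sum / 1MB
        if ($sizeMB -gt $limitMB) {
            $msg = "Your profile is {0:N0} MB; the limit is {1} MB. Please clean it up." -f $sizeMB, $limitMB
            (New-Object -ComObject WScript.Shell).Popup($msg, 0, "Roaming profile too large", 48)
        }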

  • How can I make VNC faster?

    - by NickAldwin
    I need to remotely access and use my work computer a few times a week. I want to use VNC because of the price. I've used VNC before, mostly on my own network, where it's fast. However, VNC over the internet is incredibly slow. Even at 256 colors and lower, with Aero turned off, it is unbearably slow. I recently used Ammyy Admin to connect to do something requiring a quick reaction time. Ammyy was really fast, with almost no lag, and it was running in full color with Aero on! How can I make VNC faster, like Ammyy is? I'd use Ammyy, but I would probably run into the 15hr/month limit pretty quickly. Any suggestions? [EDIT] I'm currently using UltraVNC.
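
    Beyond viewer-side settings (lowest quality, tight/ZRLE encoding, wallpaper off), one technique that often helps over the internet is running VNC through a compressed SSH tunnel; a sketch, assuming the work machine runs an SSH server and VNC listens on its default port 5900 (the hostname is a placeholder):

        # -C compresses the stream; then point the viewer at localhost:5901
        ssh -C -L 5901:localhost:5900 user@work-pc.example.com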

  • iptables: How to create a rule for a single website that does not apply to other websites?

    - by Kris
    A virtual dedicated server hosts 10 websites, with 1 firewall made with iptables. If one of those 10 websites gets hit by too many ping requests coming from one IP address, how do I limit or drop them without dropping pings for the other 9 websites? Do I create a firewall for every website? If so, how? Or is it better to change my rules? If so, how? Thank you. The original question was posted as "iptables: what's best practice when there're several websites but you want to use a rule for a single website?" but it was too vague. Let me know if more info is needed.
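
    This only works if each site has (or is given) its own IP address: ICMP carries no Host header, so when sites share one address the kernel cannot tell which "website" a ping is aimed at. With per-site IPs, one pair of rules per address does it; a sketch with a placeholder address, rate-limiting echo requests to 1 per second:

        iptables -A INPUT -p icmp --icmp-type echo-request -d 203.0.113.10 -m limit --limit 1/second --limit-burst 5 -j ACCEPT
        iptables -A INPUT -p icmp --icmp-type echo-request -d 203.0.113.10 -j DROP

    To limit per attacking source rather than overall, the hashlimit match with --hashlimit-mode srcip is the per-source variant of the same idea.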
