Search Results

Search found 9916 results on 397 pages for 'counting sort'.


  • How would you diagnose an integrated network card that doesn't show up on the device list?

    - by Tomer Gabel
    I've been setting up a server based around a Gigabyte GA-G33M-DS2R motherboard (G33, ICH9R) with an integrated Realtek 8111B network controller. I've enabled it in the BIOS, but it doesn't show up in the pre-boot PCI device list, isn't recognized by Windows (even after installing the chipset and networking drivers) and doesn't even show up via lspci. I've tried cycling power, disabling, cycling again, enabling, cycling again etc., to no avail. I'm sort of at a loss at this point; the board isn't under warranty anymore, but I'd rather not have to replace it. Any advice would be greatly appreciated...
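
    A minimal check of whether the controller is visible at the PCI level at all looks something like this sketch (the exact Realtek identifier string is an assumption):

        # List PCI devices and filter for Ethernet controllers; if nothing
        # Realtek shows up here, the OS never sees the chip at all.
        lspci | grep -i ethernet
        lspci -nn | grep -i realtek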

    Read the article

  • Sendmail problem

    - by trobrock
    I am trying to get my server to be able to send email from PHP. Currently it is using sendmail, but whenever I try to send mail to a Gmail address I get this sort of response:

        --o54Mqd5s008981.1275691959/ServerName
        Content-Type: message/delivery-status
        Reporting-MTA: dns; ServerName
        Received-From-MTA: DNS; localhost
        Arrival-Date: Fri, 4 Jun 2010 22:52:38 GMT
        Final-Recipient: RFC822; [email protected]
        Action: failed
        Status: 5.7.1
        Remote-MTA: DNS; gmail-smtp-in.l.google.com
        Diagnostic-Code: SMTP; 550-5.7.1 [xxx.xxx.xxx.xxx] The IP you're using to send mail is not authorized
        Last-Attempt-Date: Fri, 4 Jun 2010 22:52:39 GMT

    How can I set this up to relay through a Google account that I own? Is sendmail the best thing to use, or should I switch to Postfix or something? This is on Ubuntu Server 9.10.
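
    If Postfix is an option, relaying through a Gmail account usually comes down to a handful of settings along these lines. This is only a minimal sketch, assuming a Postfix install with SASL and TLS support; the account name, password and file paths are placeholders:

        # /etc/postfix/main.cf (partial sketch, not a complete config)
        relayhost = [smtp.gmail.com]:587
        smtp_use_tls = yes
        smtp_sasl_auth_enable = yes
        smtp_sasl_password_maps = hash:/etc/postfix/sasl_passwd
        smtp_sasl_security_options = noanonymous

        # /etc/postfix/sasl_passwd -- then run: postmap /etc/postfix/sasl_passwd
        [smtp.gmail.com]:587    yourname@gmail.com:yourpassword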

    Read the article

  • Is there anything like Mail Goggles for Outlook?

    - by Eugene M
    Gmail has a cool feature called Mail Goggles that requires user interaction to send an email at certain hours of the day. Is there anything remotely like this for Outlook? More specifically, what I'd like is some dialog that shows a personal reminder (to make sure certain steps have been taken, like checking who needs to be cc'd) and some sort of forced user interaction, like a checkbox. The tool would prevent you from sending the email until you had completed the interaction. If not, would it be possible to make something like this in VBA or as an add-in? I realize this sounds like a really annoying feature, but I have a friend who is having a lot of trouble building certain email habits and this would be a sure-fire way to help.

    Read the article

  • What would you use to keep track of all the random information about your LAN/Environment?

    - by agnul
    Say you work in your average anarchy... no appointed IT guy, no office manager whatsoever, just developers working in free-for-all mode and a couple of non-technical people around for admin (non-IT) stuff. What would you use to keep track of: hardware/software inventory; backups of OS images, drivers and all that; LAN configuration information; passwords for all the systems? Currently I'm thinking of having some sort of wiki + web server on our only "server-class" machine, but then I fear I'd be the only one who would go through the hassle of adding information to the wiki... So, have you been through that? What would you suggest?

    Read the article

  • How to enable password challenge in Active Directory?

    - by Antonio Laguna
    As IT support, my team spends a lot of time resetting passwords, so we thought it would be interesting to enable some sort of password challenge in Active Directory, so that users could reset their own passwords after correctly answering some questions. Although we alert users by mail when their passwords are about to expire, they just delete the mail and carry on, so we think this would be a great idea. I've seen some commercial products, but I'm not sure if there is something built-in or GPL that enables this kind of feature. Could someone shed some light on it?

    Read the article

  • How do I make a privileged port non-privileged in Red Hat 5?

    - by Jason Thompson
    So I have a Red Hat 5 box on which I want to run an application I wrote that implements SLP. SLP uses port 427 for answering service queries. My understanding is that ports below 1024 are "privileged" and thus cannot be bound to by anything that isn't running as root. I cannot run this application as root, as it is launched via Tomcat. One creative solution I really like is simply writing an iptables rule to redirect the privileged port to a non-privileged one. In my proof-of-concept tests, this works wonderfully. Unfortunately, the powers that be would greatly (and understandably) prefer that my application did not require screwing around with iptables upon installation. So I heard a rumor, and cannot find anything to verify it, that there is some sort of command or parameter that could be set to make any port I want non-privileged. Is this true? If so, how is this done? Thanks!
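
    For reference, the proof-of-concept redirect described above can be sketched roughly as follows; the unprivileged target port (1427) is an arbitrary example, and SLP is assumed to use both UDP and TCP on 427:

        # Redirect incoming SLP traffic on privileged port 427 to an
        # unprivileged port that the non-root application can bind to.
        iptables -t nat -A PREROUTING -p udp --dport 427 -j REDIRECT --to-ports 1427
        iptables -t nat -A PREROUTING -p tcp --dport 427 -j REDIRECT --to-ports 1427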

    Read the article

  • Local dedicated hosting space (own hardware)

    - by Scott
    Where can I find local dedicated hosting space for my own hardware? I know I can rent dedicated hosting from various companies online, but usually I think that means I'm renting their hardware too. I just need a space with a network connection and a power outlet. That's it. How much would this cost? What would I search for? Is it easily available? Or is it only the sort of thing huge companies would do? I'm in the greater NYC area. It's for a project I'm working on, but the thing's loud and annoying. I'd be willing to pay a little to get it out of sight and out of mind. I don't even care too much about the quality of the network connection. I'd rather not rent other people's hardware because it would probably cost a fortune to rent a machine like this (tons of RAM).

    Read the article

  • Reliable 1 TB or larger hard drive?

    - by jasondavis
    I am in the market for 2-3 new drives, and I would like each to be at least 1 TB to 2 TB in size. I have been reading all the reviews on newegg.com for 1 TB and larger drives and they all have one thing in common: almost all the ones I read about have complaints of drives being DOA or dying within a few weeks of use. I am hoping to find some drives in this storage range that have a reputation for lasting a long time instead of living a short life. Please help me if you have any experience with this sort of drive. Most of the ones I read about were Western Digital. I realize some might complain that this question's answer is tied to a timeframe, so if a user searches and finds this answer a year from now it will be outdated, but I would appreciate any help based on the hard drives available as of April 10th, 2010 on newegg.com.

    Read the article

  • TeamCity backup from REST API doesn't back up

    - by ChrisFletcher
    I'm using the REST API in TeamCity 6.5 to request that a backup be created, as follows: http://teamcity:8111/httpAuth/app/rest/server/backup?includeConfigs=true&includeDatabase=true&includeBuildLogs=true&fileName=filename as specified in the documentation here: http://confluence.jetbrains.net/display/TW/REST+API+Plugin#RESTAPIPlugin-DataBackup The problem is that it simply returns "idle", as does the backup status call, and no backup appears. I can back up fine from the web interface, and I can return a list of users from the REST API without issue as well. I'm starting to suspect there is some sort of permission or config option, but I can't find one.
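
    For what it's worth, invoking the documented endpoint from the command line can be sketched like this; the credentials and file name are placeholders, and depending on the TeamCity version the backup request may need to be sent as a POST rather than a GET:

        # Trigger a backup via the REST API (basic auth).
        curl --user admin:password -X POST \
          "http://teamcity:8111/httpAuth/app/rest/server/backup?includeConfigs=true&includeDatabase=true&includeBuildLogs=true&fileName=my_backup"

        # Check the backup status afterwards.
        curl --user admin:password "http://teamcity:8111/httpAuth/app/rest/server/backup"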

    Read the article

  • Bash script getting automatically deleted from Ubuntu 12.04 Server?

    - by Kris Anderson
    I'm running a bash script on an Ubuntu 12.04 server through cron. The script works fine for a few weeks (it runs daily backups of websites and MySQL databases and copies them to Amazon S3). However, twice now I've noticed that backups stopped happening. Both times the backup script (backupscript.sh), located in my home folder, was no longer there. No one else has access to this server, so nothing was manually changed on the server and no one deleted the file by mistake. The cron job (nano /etc/crontab) still references this script, but the script itself disappears. What could cause this to happen? Does Ubuntu delete the script if it runs into some sort of error?

    Read the article

  • I accidentally deleted the recovery folder on a partition (Win Vista Home)

    - by paul
    I accidentally deleted the recovery folder on the recovery partition (Win Vista Home). I think it was some sort of scheduled maintenance by some program that I did not configure properly? Oops... I called Toshiba and they said I needed to buy a recovery program, which I didn't bother doing. I bought a legal copy of Vista and would like to install the correct files in such a way that when my computer starts looking for the recovery files it will eventually find them, or I can point it to the partition. I'm pretty sure it's not just a matter of copy and paste (is it?). Thanks, Paul

    Read the article

  • Database backup regardless of backup made through a control panel?

    - by developer
    I know that all CMS or CMC platforms have some sort of walkthrough for their users to fully back up and restore the database or the whole website. While we can all perform such backups (there are even plugins which automate the whole procedure) and restore them when necessary (such as when we migrate to a new server), one can also back up the whole website by means of a control panel such as DirectAdmin or cPanel. Now, I just want to know whether it is necessary for us to do database or website backups the way they are described by a specific CMC or CMS developer even after we perform a whole-website backup in a control panel such as DirectAdmin. An example of such a CMS platform is Moodle; the Moodle docs describe how to back up and restore Moodle here. So do we, out of necessity, have to make the backup the way described, or can we simply do it the control-panel way? Thanks
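
    For comparison, the application-level backup that CMS guides typically describe amounts to something like the sketch below; the database name, user and paths are placeholders rather than anything Moodle-specific:

        # Dump the CMS database to a dated, compressed file.
        mysqldump -u cms_user -p cms_db | gzip > cms_db-$(date +%F).sql.gz

        # Archive the site files (and any separate data directory) alongside it.
        tar czf cms_files-$(date +%F).tar.gz /var/www/site /var/sitedata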

    Read the article

  • Windows 7 Ult machine can see XP Pro but not vice versa

    - by Chadworthington
    My new Windows 7 Ultimate PC can attach to my work laptop and pull files, but my work laptop cannot find my Windows 7 PC on the network. Is there some sort of limitation going on here? Before I got the new machine, my old XP Pro PC could pull files from the XP Pro laptop, but not vice versa. The common thread seems to be that the work laptop cannot see other PCs, Windows 7 or not. Could it be because that PC is on a work domain? When I pull files from the work PC, I am prompted for domain credentials, which I provide.

    Read the article

  • Rsync and Windows 7

    - by Nate
    Can someone give me any tips on setting up some sort of rsync server/client on Windows 7 to run rsync between my web hosting server and a backup server that I have running Ubuntu? I've tried setting it up with this tutorial: http://www.youtube.com/watch?v=CvwdkZLNtnA using copssh and cwrsync. I ran into all sorts of trouble, including not being able to get cwrsync to run (it installs properly, but never starts up) and copssh not generating the keys at all. The guy was running Windows Server 2003, though, so I'm guessing the problems could just be because I'm running Windows 7. I've been trying to set it up with my Windows machine as the rsync server, and then Ubuntu and my web hosting VPS as the clients, but I realize it may be easier (and make more sense) to just set up the rsync server on Ubuntu and an rsync client on Windows 7. Can anyone point me in the right direction? I'm thinking of using this guide: http://www.gaztronics.net/rsync.php It seems a bit outdated, though.
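
    Whichever machine ends up acting as the server, the transfer itself usually boils down to a single rsync-over-SSH command along these lines; the host names, paths and key file are placeholders, and on the Windows side it would run from the cwrsync environment:

        # Pull the remote web root down to a local backup directory over SSH.
        rsync -avz --delete -e "ssh -i /path/to/id_rsa" \
          user@webhost.example.com:/var/www/ /backups/webhost/www/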

    Read the article

  • Recovering an old website

    - by noah
    I have a client with an old website that somebody set up for him long ago. The guy who set it up is unreachable, so how do we go about trying to take it over? A WHOIS lookup got us some contact information, but I don't have great hopes for that (it hasn't been updated in quite some time). The nameservers are ns1.theplanet.com and ns2.theplanet.com, and we will try calling them, but I don't expect we'll be able to get much from them. What are our options? Is there a way I can discover the registrar so we can try contacting them as well? EDIT: It would be sufficient if we could get control of the domain name or put in some sort of redirect to the new site. Either hosting was prepaid for quite some time, or someone else is still paying for it, so we don't care about that.
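
    The registrar normally appears in the same WHOIS output that yielded the contact information, so a check along these lines may be all that's needed (the domain here is a placeholder):

        # Pull the registrar, expiry and registrant details out of WHOIS.
        whois example.com | grep -iE 'registrar|expir|registrant'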

    Read the article

  • How to automatically set default quota limits for users on an XFS filesystem when a new account is created

    - by acidburn2k
    I guess the title explains the problem pretty well. Do you have an idea for a mechanism which would automatically assign default quota values to every new account created (sort of like the skel scheme, but for quotas)? I am looking for a generic, clean solution, not some ugly cron-based scripts or wrapper scripts for creating users. I would also like to avoid any external, unmaintained stuff (like forgotten PAM modules and such). Anything that could lead to overhead and extra work in the future isn't really a solution, nor is checking for new accounts every minute.

    Read the article

  • Understanding connection tracking in iptables

    - by Matt
    I'm after some clarification of the state/connection tracking in iptables. What is the difference between these rules?

        iptables -A FORWARD -m state --state ESTABLISHED,RELATED -j ACCEPT
        iptables -A FORWARD -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT

    Is connection tracking turned on when a packet is first matched by a rule containing -m state --state BLA, or is connection tracking always on? Can/should connection state be used for fast matching like below? E.g. suppose this is some sort of router/firewall (no NAT).

        # Default DROP policy
        iptables -P INPUT DROP
        iptables -P OUTPUT DROP
        iptables -P FORWARD DROP

        # Drop invalid
        iptables -A FORWARD -m state --state INVALID -j DROP

        # Accept established,related connections
        iptables -A FORWARD -m state --state ESTABLISHED,RELATED -j ACCEPT

        # Allow ssh through, track connection
        iptables -A FORWARD -p tcp --syn --dport 22 -m state --state NEW -j ACCEPT

    Read the article

  • Why are some recovery tools still able to find deleted files after I purge the Recycle Bin, defrag the disk and zero-fill free space?

    - by Ivan
    As far as I understand, when I delete a file (without using the Recycle Bin), its record is removed from the file system's table of contents (FAT/MFT/etc.), but the contents of the disk sectors which were occupied by the file remain intact until those sectors are reused to write something else. When I use some sort of erased-file recovery tool, it reads those sectors directly and tries to rebuild the original file. In that case, what I can't understand is why recovery tools are still able to find deleted files (with a reduced chance of rebuilding them, though) after I defragment the drive and overwrite all the free space with zeros. Can you explain this? I thought zero-overwritten deleted files could only be found by means of special forensic-lab magnetic-scan hardware, and that those complex wiping algorithms (overwriting free space multiple times with random and non-random patterns) only made sense to prevent such a physical scan from succeeding, but in practice it seems that a plain zero fill is not enough to wipe all traces of deleted files. How can this be?

    Read the article

  • How to Access an AWS Instance with RDC when it is on a Private Subnet of a VPC

    - by dalej
    We are implementing a typical Amazon VPC with public and private subnets, with all servers running Windows. The MS SQL instances will be on the private subnet, with all IIS/web servers on the public subnet. We have followed the detailed instructions at Scenario 2: VPC with Public and Private Subnets and everything works properly, until the point where you want to set up a Remote Desktop Connection to the SQL server(s) on the private subnet. At this point the instructions assume you are accessing a server on the public subnet, and it is not clear what is required to RDC to a server on a private subnet. It would make sense that some sort of port redirection is necessary; perhaps accessing the EIP of the NAT instance to reach a particular SQL server? Or perhaps using an Elastic Load Balancer (even though this is really for HTTP protocols)? But it is not obvious what additional setup is required for such a Remote Desktop Connection.
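
    One way to sketch the port-redirection idea is an SSH tunnel through an instance that is reachable from outside, such as the NAT instance, assuming it runs SSH and the security groups allow the hops; the addresses and key file below are placeholders, not values from the AWS guide:

        # Forward local port 13389 to the private SQL server's RDP port (3389)
        # via the NAT instance's Elastic IP, then point the RDP client at
        # localhost:13389.
        ssh -i vpc-key.pem -N -L 13389:10.0.1.25:3389 ec2-user@nat-instance-eip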

    Read the article

  • Best DSL hardware for ADSL Troubleshooting

    - by Jeff Sacksteder
    I have a situation where I need to make the best of a bad DSL situation. The CPE is a black box with no access to DSL diagnostics. My plan is to get some sort of DSL hardware that exposes link-layer state and gives me knobs to tweak. I'd like to be able to mitigate bufferbloat as much as I can while I'm at it. The obvious choice would seem to be a Sangoma card in a Linux system, but I have no way of knowing whether that will do anything for me without testing it. I have no other access to WAN troubleshooting equipment. Are there any other options available to me as a consumer?

    Read the article

  • Google Bookmarks thinks I'm in Portugal

    - by Aeo
    On Monday I was looking for a convenient way to keep links to web sites in sync between multiple computers. After a bit of searching, I found out Google offers a service that covers my needs just fine: Google Bookmarks. Fast forward to today, Wednesday afternoon, when I loaded up Google Bookmarks to find the whole page in Portuguese. Sorry, I don't speak Portuguese. I did a cacheless refresh of my Gmail tab to make sure something wasn't up with my whole account, and it's fine. Hitting the translate button works... sort of. It works enough to at least know what's going on. I can work with this, but it really seems backwards. How do I tell Google I don't understand Portuguese in my Bookmarks?

    Read the article

  • Catch-all DNS record

    - by Christian Sciberras
    Intro: Our users have the ability to buy a domain (e.g. user1.com) and make it point to our website (e.g. example.com), simply by pointing user1.com to ns1/ns2.example.com. Issue: So far everything's good; however, the example.com server does not like this, and we need to set up WHM/cPanel to make it accept user1.com. The problem is, we'd rather make this automatic, possibly without having to use the WHM API. The question: We need some sort of "catch-all" wildcard entry so that we capture all of our users' possible domains.
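
    On the web-server side, the catch-all part is often handled by a default virtual host that answers for any Host header. This is only a minimal Apache sketch of the idea, written independently of WHM/cPanel, with the names and paths as placeholders:

        # Catch-all virtual host: serves requests for any domain that resolves
        # to this server but has no dedicated vhost of its own.
        <VirtualHost *:80>
            ServerName catchall.example.com
            ServerAlias *
            DocumentRoot /var/www/catchall
        </VirtualHost>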

    Read the article

  • How Do You Stress-Test Your Hard Drives?

    - by MetaHyperBolic
    When looking for large new drives (>= 1 TB) on Newegg and the like, I note a number of reviews talking about drives being either D.O.A. or hitting the Click of Death (or even releasing the magic smoke) within a week or so of use. A portion of the reviews mention this phenomenon whether the drive in question is Western Digital or Hitachi or whatever. For those of you using Windows, what do you do to: 1) place a large initial stress on the drive to see if it can take it, and for how long? 2) test the drive afterwards (presumably with some sort of S.M.A.R.T. tool or other) to see if any negative changes have been noted? Note: this is one component of a larger plan for both high availability and backups for my home data.

    Read the article

  • Can't mount home after trying to resize (bad geometry: block count exceeds size of device).

    - by Lynn
    This is on a fresh computer (a supercomputer, actually). It came to me with 15 TB on the home mount and 50 GB on root. I tried allocating 7 TB to root and resizing (since I'm putting a local yum repo on this machine, as it has no internet access nor ever will), following the instructions here: CentOS 6.3 disk space allocation. Something went wrong, though, and home won't mount anymore. Instead I get this from dmesg | tail:

        EXT4-fs (dm-2): bad geometry: block count 4294967295 exceeds size of device (1342177280 blocks)

    df -h gives this output:

        Filesystem                    Size  Used Avail Use% Mounted on
        /dev/mapper/VolGroup-lv_root  7.0T  3.6G  6.6T   1% /
        tmpfs                         190G  216K  190G   1% /dev/shm
        /dev/sda1                     485M   38M  422M   9% /boot

    I didn't have any files on /dev/mapper/VolGroup-lv_home. Will simply running mke2fs fix it and make it mountable again? What sort of options should I run it with? I've never resized volumes before or used mke2fs, and I don't want to make this mess worse.
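
    Since the home logical volume reportedly holds no data, the simplest path is usually to recreate the filesystem at the volume's current size and remount it. This is just a sketch, not advice from the thread; double-check the volume name before running anything destructive:

        # Recreate an ext4 filesystem on the (empty) home logical volume.
        # Destructive: assumes there really is nothing to keep on lv_home.
        mkfs.ext4 /dev/mapper/VolGroup-lv_home

        # Mount it again (or use "mount -a" if it is listed in /etc/fstab).
        mount /dev/mapper/VolGroup-lv_home /home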

    Read the article

  • Chrome Crash Investigation

    - by iamcreasy
    Chrome is crashing very frequently (every 2-3 minutes) and becomes unresponsive. How can I start investigating why it is crashing so much? It feels to me as if certain components of some web pages are triggering the crash. I also checked "C:\Users\irfan\AppData\Local\Google\CrashReports", but this folder is empty. What I'm after is some sort of process-tracking tool that keeps an eye on which request is being made just before the crash, or something like that. Any software suggestions? I'm using Windows 7. Please don't suggest reinstalling Chrome; I want to know why this is happening.

    Read the article
