Search Results

Search found 10815 results on 433 pages for 'stored procedure'.

Page 237/433 | < Previous Page | 233 234 235 236 237 238 239 240 241 242 243 244  | Next Page >

  • IIS permissions issue pointing docroot to Samba share

    - by lalalalalalalambda
    I have an IIS project which is stored on a Samba share, network-mounted with the following line: X: \\my-samba-server\dev /user:freddie Connectivity is fine; I can read and write files on X:. In IIS, I'm trying to set the physical path to \\my-samba-server\dev\folder\to\my\files, which results in the following 500.19 error: Config Error | Cannot read configuration file due to insufficient permissions. By default it tries to use pass-through authentication. If I try to set this to connect as the specific user freddie, I receive: The specified user does not exist. What is the correct way to connect to a path which has been set up as described above? (The Samba man pages indicate version 3.6 on the Debian host.)
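
    For illustration only, a hedged sketch of setting the physical path and connect-as credentials from the command line with appcmd, assuming the content is the root of a site named "Default Web Site" and that the password shown is a placeholder (both are assumptions, not details from the question):
      rem the userName/password pair becomes the "connect as" identity IIS uses for the UNC path
      %windir%\system32\inetsrv\appcmd.exe set vdir "Default Web Site/" -physicalPath:\\my-samba-server\dev\folder\to\my\files -userName:freddie -password:PLACEHOLDER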

    Read the article

  • Outlook 2010: When sending message on behalf of someone else, store that message in the other person's Sent Items folder

    - by Helge Klein
    We have an e-mail account for support purposes which is tended to by multiple members of the team. When answering a support e-mail we obviously choose the support account as the sender. Still, the answer is not stored in the support account's Sent Items folder, but in the Sent Items of the person actually answering. This behavior, which seems to be by design, prevents others from seeing the entire conversation and can lead to duplicate answers. I am looking for an automated way of moving e-mails sent on behalf of someone else to that person's Sent Items folder. I tried to create a rule for this but could not find the right setting.

    Read the article

  • What happens to a Windows Dynamic Drive

    - by GruffTech
    In my Windows (Windows 7) install I have two primary logical volumes. One is on an SSD hard drive for my operating system and installed software, and the other is a dynamic volume for my stored media (I do a lot of work with HD footage). I have all my media on the original DV tapes for backup purposes; having it all available on my hard drive at all times is just a major convenience for me, well worth the few hundred dollar investment those 2TB drives were. Anyway, long story short, my Windows install has become problematic and I want to reformat Windows. Does this, or will this, affect my dynamic drive in any way? I've got almost 3 TB of video on there and I really don't want to re-import all my DV tapes.

    Read the article

  • Shared resources in Windows Server 2008 were lost

    - by user316687
    We have an Oracle database on Windows Server 2003 which has its archived redo logs stored on a shared resource on a Windows Server 2008 machine: \\192.168.1.189\d$\folder_for_archivedlogs However, according to Oracle's alert.log, at 10:01 p.m. that shared resource was lost and the database became inaccessible. From the Windows Server 2003 machine I couldn't access that shared resource in Windows Explorer, but I did get a response when I pinged 192.168.1.189. I reviewed all the Event Logs on that Windows 2008 machine, but there is no error at 10:00 p.m. or 11:00 p.m. Has anyone seen a similar case before (a shared resource is lost, but you can still ping the server and there are no error events in the Event Logs)?

    Read the article

  • How does an OS communicate with other hardware components?

    - by Jack
    How can a program running on the CPU (mostly the OS) access other PC hardware, such as the graphics card, the hard drive and so on? From what I've read, in DOS this was done using BIOS calls, specifically the INT instruction. But the INT instruction only jumps to a certain location in RAM. So how can a program stored in RAM access other computer hardware, when the CPU can only access RAM and receive interrupts? Does Windows use INT instructions as well, or is there a newer way to communicate with the hardware?

    Read the article

  • What kind of SSL Cert do I need and where do I get it?

    - by chacham15
    I want to have subdomains with SSL within my domain. The twist is that each subdomain is hosted by a different person with a different public/private key pair. Let me illustrate with an example: (1) a user sends his public key and requests a subdomain from foo.com; (2) the user is added and assigned the subdomain bar (bar.foo.com), and the user's public key is stored for future validation against bar.foo.com; (3) the user goes to bar.foo.com and sees a validated SSL connection. From what I gather, this means that I need to create a CA, which is fine. The problem is that, from what I recall, a CA needs a special sort of SSL cert. How do I go about getting this?
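
    For illustration, a rough openssl sketch of the private-CA route described above (file names are made up, and certificates signed by a private CA are only trusted by clients that have been told to trust ca.crt):
      openssl genrsa -out ca.key 2048                              # CA private key
      openssl req -new -x509 -days 3650 -key ca.key -out ca.crt    # self-signed CA certificate
      openssl x509 -req -days 365 -in bar.foo.com.csr -CA ca.crt -CAkey ca.key -CAcreateserial -out bar.foo.com.crt    # sign a subdomain's CSR
    Here bar.foo.com.csr stands for a certificate request generated from the subdomain owner's own key.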

    Read the article

  • Windows Login Failure

    - by Chris Bateson
    I'm getting an error in the Event Viewer which is also generating a lot of Logon Failure messages on our syslog server, and I'm pretty much stuck on how to resolve it. Event ID: 536, Logon Type: 3, Reason: The NetLogon component is not active. This is on a Windows Server 2003 system. I have checked here. We're using Shavlik Protect 9 to scan and deploy patches. Shavlik stores the credentials for the systems and uses those stored credentials to deploy patches. This system is able to scan and deploy to other systems on the network using those credentials and no errors are generated, but when installing to the local system that Shavlik itself runs on, this error is generated. What's interesting is that it doesn't occur during a scan, and the patches install fine. We've contacted Shavlik, only to get the response that they are unable to help since it's a Microsoft error. Has anyone seen this?

    Read the article

  • Best way to find the computer a user last logged on from?

    - by Garrett
    I am hoping that somewhere in Active Directory the "last logged on from [computer]" information is written/stored, or that there is a log I can parse. The reason for wanting to know the last PC a user logged on from is to offer remote support over the network - our users move around pretty infrequently, but I'd like to know that whatever record I'm consulting was updated that morning (when they logged in, presumably) at a minimum. I'm also considering login scripts that write the user and computer names to a known location I can reference, but some of our users don't log out for 15 days at a time. If there is an elegant solution that uses login scripts, definitely mention it - but if it also fires when the workstation is merely unlocked, that would be even better!
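
    A minimal sketch of the login-script idea mentioned above, assuming a writable hidden share (\\server\lastlogon$ is a made-up name); each logon overwrites a small text file named after the user:
      echo %COMPUTERNAME% %DATE% %TIME% > \\server\lastlogon$\%USERNAME%.txt
    This only runs at logon, not at unlock, so it covers the basic case rather than the unlock scenario.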

    Read the article

  • Postfix/dovecot remove LDAP user

    - by dove221
    I have to remove or blacklist an LDAP/Dovecot user. Authentication is set up against Active Directory, which I cannot manage, so I thought there should at least be a way to disable this specific user locally on the mail server.
      # Virtual Accounts - LDAP - MS AD
      virtual_mailbox_maps = ldap:/etc/postfix/ldap_mailbox_maps.cf
      virtual_alias_maps = ldap:/etc/postfix/ldap_alias_maps_redirect_true.cf ldap:/etc/postfix/ldap_alias_maps_redirect_false.cf ldap:/etc/postfix/ldap_mailbox_groups.cf
      virtual_mailbox_domains = domain.com
      virtual_uid_maps = static:1000
      virtual_gid_maps = static:1000
      virtual_transport = dovecot
      dovecot_destination_recipient_limit = 1
    Does anybody know how to do this? I followed this guide for disabling a single user through Postfix's access file: http://www.cyberciti.biz/faq/howto-blacklist-reject-sender-email-address/ Unfortunately it doesn't work; it's as if the settings stored in LDAP overrule the access rule, and instead of rejecting the mail Postfix keeps accepting it. Thanks!
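
    The question doesn't contain enough detail to say why the access rule is ignored, but for reference this is roughly what the check_recipient_access wiring from that kind of guide looks like (blocked.user@domain.com is a made-up address):
      # /etc/postfix/main.cf
      smtpd_recipient_restrictions = check_recipient_access hash:/etc/postfix/access, permit_mynetworks, reject_unauth_destination
      # /etc/postfix/access
      blocked.user@domain.com   REJECT mailbox disabled
      # rebuild the lookup table and reload
      postmap /etc/postfix/access
      postfix reload
    check_recipient_access is applied to the envelope recipient exactly as given in RCPT TO, before any virtual alias rewriting, so one thing worth checking is whether the blocked mail actually arrives addressed to that exact address or to an alias that only later expands to it.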

    Read the article

  • How to push to github from a server account with multiple users?

    - by kirdie
    We have a web server which hosts a web application kept in a GitHub project. All of us can push from our local machines to GitHub and then pull on the server, but sometimes we want to make small changes and immediately see the effect, so it would be great to be able to push from the server too. I created an SSH key for the server, but I don't want to add the server's SSH key to my GitHub account, because then all GitHub actions done from the server would be counted against my account. Is it possible to add the SSH key to the GitHub web application project without creating a new user for the server, and what is the best practice for this situation? I obviously don't want to copy my private key to the server either.

    Read the article

  • 1 iPhone, 2 (potentially 3) Macs......& Google Apps Sync?!

    - by Goober
    Currently I use a beast of a G5 Mac at work for all my software/web development, an iMac G5 at home for personal use, and a MacBook Pro for when I'm on the go. At work we make heavy use of all the Google Apps features, such as Calendar, Docs, etc. We all have an address book (the local Mac application) that gets synced with a list of contacts stored on our mail server. I want to be able to integrate the Google Apps calendar with my iPhone calendar (or even just iCal, which I can then sync to the iPhone). Essentially I want all three Macs to have the Google calendar synced to their copy of iCal, as well as the iPhone. I've heard that Google Sync handles something along these lines, but I'm unsure whether it will fulfill my needs. I'm trying my best to avoid having to use MobileMe. Help greatly appreciated. Kind regards

    Read the article

  • sp_releaseapplock timeout expired cause?

    - by Darian Miller
    I've been using a combination of sp_getapplock and sp_releaseapplock for some custom application-locking purposes for years now with success, and in just the last few days I've started to get some "timeout expired" errors on sp_releaseapplock, which is a bit puzzling. When listing the current locks there are fewer than a dozen of these active, and the rest of the dedicated server is way underutilized at the moment (less than 100 batches/sec on a multi-processor, 32 GB RAM, higher-end machine). Is there a specific resource to be monitored that may point me in the right direction for determining why such a lightweight operation is timing out? This is called within a stored proc with a timeout of 120 seconds, which seems amazingly long for this operation to time out on. SQL 2000 SP4 running on Windows Server 2003. The T-SQL used (@pLockUniqueName is VarChar(255)):
      EXEC @pLockSuccess = sp_getapplock @pLockUniqueName, 'Exclusive', 'Session', 0
      EXEC @pUnLockSuccess = sp_releaseapplock @pLockUniqueName, 'Session'
    Thanks, Darian
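
    Not a diagnosis, just a sketch of what can be checked on SQL Server 2000 while one of these calls is hanging; both procedures ship with the product:
      EXEC sp_lock    -- application locks appear with Type = 'APP'; see who else holds or waits on the resource
      EXEC sp_who2    -- the BlkBy column shows which SPID, if any, is blocking the session issuing sp_releaseapplock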

    Read the article

  • Can KVM roll back changes to Virtual Disks automatically?

    - by Cygon
    I'm currently using VirtualBox on my Linux server to run a small Windows guest OS. I've configured its main virtual hard drive as what VBox calls "immutable", meaning that any changes to it are written into a differencing image that is discarded when the system reboots. Can KVM do something similar? I've read about snapshots via "savevm" and "loadvm", but I believe those are saved states, not differencing images. What I ultimately want is a VM with two drives: one that reverts on each reboot and one that keeps its changes. Ideally, the unchangeable drive image should be stored with only read access granted to the user running KVM.
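
    For reference, qemu/KVM can get close to this with a per-drive snapshot option or a throwaway qcow2 overlay; a sketch with made-up file names and memory size:
      # writes to the system disk go to a temporary file and are discarded at shutdown; the data disk keeps its changes
      kvm -m 1024 -drive file=/vm/win-system.img,snapshot=on -drive file=/vm/win-data.img
      # alternative: run the guest against a disposable overlay backed by a read-only base image, recreating the overlay on each reboot
      qemu-img create -f qcow2 -b /vm/win-system-base.img /vm/win-system-overlay.qcow2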

    Read the article

  • Inexpensive degaussers or HDD shredders?

    - by Nicholas Knight
    I do a lot of work for a small cash-strapped business that has a lot of active hard drives; most are consumer-grade SATA drives about five years old, and predictably they are dying at an increasing rate. A lot of the time they can't even be detected, let alone complete a zero-out cycle. Right now those drives are just being stored, but that can't continue forever. We've also got a couple of bad LTO tapes it'd be nice to deal with. There are very real security and legal issues that make dropping them off with someone who merely claims they'll be properly destroyed a gamble. I've looked around at degaussers and HDD shredders, and the ones that don't look like they come from some guy in his basement all seem to be $3000+, which is hard to swallow right now. Is there anything out there in the $500-1500 range that you would recommend? (Speed isn't a big issue; if it takes several minutes or even hours per drive, that's completely OK - we've only got 10 or so thus far.)

    Read the article

  • Windows Advanced Firewall certificate based IPSEC

    - by Tim Brigham
    I'm working on migrating from IPsec settings stored under 'IP Security Policies on Active Directory' to 'Windows Firewall with Advanced Security' for my 2008+ boxes. I have successfully been able to get this set up using Kerberos authentication; however, the Openswan implementation on my Linux boxes uses certificates. Whenever I try changing the authentication method to computer certificate (using RSA and my root CA), the connection bombs out. I've made this change both in a connection request policy and in the IPsec settings on the root Windows Firewall with Advanced Security node. The Windows event log shows that the authentication request is taking place but fails while negotiating a mode. What am I missing here?

    Read the article

  • Portable and Secure Document Repository

    - by Sivakanesh
    I'm trying to find a document manager/repository (WinXP) that can be used from a USB disk. I would like a tool that allows you to add all documents to a single repository (or a secure file system). Ideally you would log in to this portable application to add or retrieve a document, and documents shouldn't be accessible outside of the application. I have found an application called Benubird Pro (the app is portable) that allows you to add files to a single repository, but the downsides are that it is not secure and that the repository is always stored on the PC rather than on the USB disk. Can you recommend any other applications? Thanks

    Read the article

  • How to properly backup mediawiki database (mysql) without messing up the data?

    - by Toto
    I want to back up a MediaWiki database stored in a MySQL 5.1.36 server using mysqldump. Most of the wiki articles are written in Spanish and I don't want to mangle them by creating the dump with the wrong character set.
      mysql> status
      --------------
      ...
      Current database: wikidb
      Current user: root@localhost
      ...
      Server version: 5.1.36-community-log MySQL Community Server (GPL)
      ...
      Server characterset: latin1
      Db characterset: utf8
      Client characterset: latin1
      Conn. characterset: latin1
      ...
    Using the following command: mysql> show create table text; I see that the CREATE TABLE statement sets the charset to binary:
      CREATE TABLE `text` (
        `old_id` int(10) unsigned NOT NULL AUTO_INCREMENT,
        `old_text` mediumblob NOT NULL,
        `old_flags` tinyblob NOT NULL,
        PRIMARY KEY (`old_id`)
      ) ENGINE=InnoDB AUTO_INCREMENT=317 DEFAULT CHARSET=binary MAX_ROWS=10000000 AVG_ROW_LENGTH=10240
    How should I use mysqldump to properly generate a backup of that database?
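
    A commonly used approach for MediaWiki's binary-charset tables, offered here only as a sketch: dump (and restore) with a binary character set so the mediumblob text rows are not pushed through a character-set conversion.
      mysqldump --default-character-set=binary --single-transaction wikidb > wikidb.sql
      # restore with the same setting
      mysql --default-character-set=binary wikidb < wikidb.sql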

    Read the article

  • mailsend not sending to (or qmail not receiving from) the same machine

    - by roman
    A web application sends two emails: (1) to the user of the webapp, (2) to the administrator. The administrator's mailbox (qmail) is on the same machine as the web application (PHP, Apache, /usr/sbin/sendmail). Email 1 works; email 2 sometimes doesn't. I don't see any pattern in the mails that don't work, partly because I don't know exactly WHICH emails failed (the email itself would have been the only notification). Email 2 looks like this:
      from: <[email protected]>   # changes for each user
      to: <[email protected]>
    What could be the problem? Are rejected emails stored somewhere? (If they are rejected, how do I check this?)
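
    A few qmail-side things that can be checked (a sketch; the log location depends on how qmail logging is set up on this particular machine):
      /var/qmail/bin/qmail-qstat     # how many messages are sitting in the local queue
      /var/qmail/bin/qmail-qread     # list queued messages with their senders and recipients
      grep -F 'administrator address here' /var/log/maillog   # or wherever multilog/syslog writes the qmail log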

    Read the article

  • Moving Away from Exchange public folders - Export to file system folder?

    - by Mr. Monkey
    I have a public folder that was used for the wrong reason. Due to some regulations we had to store lots of photos - we're talking at least 7000 photos, stored based on store location. So, for example, each store would send in an email with at least 2 photos of its location, and that email would contain the location name or number along with the photos, so there was some sort of organization to it. I would love to move the contents of that public folder to a normal Windows folder we could share on a server. Is anything like that possible? Does anybody have other ideas?

    Read the article

  • In CentOS 4.3 Webmin 1.3000 bandwidth monitoring is eating disk space. How to delete those files?

    - by Silkograph
    I maintain a Linux server used for mail, Squid, and DNS services. Recently I noticed that something was eating the server's disk space, and today I finally caught the culprit, which was consuming the disk by storing a large number of files. Webmin 1.300 is installed on this server. We use the Squid proxy and Sarg to monitor Internet access, and for the last few years I have always manually cleared the Sarg-generated files under /var/www/html/squid, but I never realized that Webmin also stores some kind of bandwidth log files in its directory structure. I have noticed that under /etc/webmin/bandwidth/hours it has stored many thousands of files since 2007, totaling about 17 GB. The server machine has a 40 GB HDD. My question is: how can I delete those /etc/webmin/bandwidth/hours files safely?
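
    A cautious way to reclaim the space, sketched here with an arbitrary 90-day cutoff (the path is the one from the question): preview first, then delete.
      find /etc/webmin/bandwidth/hours -type f -mtime +90 | wc -l             # how many files would be removed
      find /etc/webmin/bandwidth/hours -type f -mtime +90 -exec rm -f {} \;   # remove them
    Stopping or reconfiguring Webmin's bandwidth monitoring module is what keeps new files from piling up again.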

    Read the article

  • How do I fix this error? Windows server 2003 the application failed to initialize properly (0xc0000022)

    - by Sharon
    Opening one of the programs from the user's desktop, I get the above application error. It is a program stored on the server, with the icon then placed on the user's desktop (this is how I was told to do it), but it does not run the application. I don't know anything about group policies etc. and can just about manage to add users in Active Directory, and that is it. We just have a folder that we drop the program icons into. Any ideas? I must be doing something wrong, as it doesn't always show up on their desktops either. What is the simplest way to do this? Thanks

    Read the article

  • backup and file server for 50+ TB of data

    - by a-bomb
    Our office wants to build a new server to handle our data. Over the last 10 years our data has been stored on CDs, DVDs, and HDDs, but now they want all of it in one place, attached to the network, so everybody in the office can access it. There are 20 TB of new data and the rest is old; the important thing now is to store these 20 TB and gradually bring in the other 30 TB over time. So what is the best solution? We thought of getting an HP server and connecting it to an external enclosure with either tape drives or HDDs (we haven't decided yet), or getting a NAS and connecting it to the HP server. What should we do? This is new for us.

    Read the article

  • VMware + SQL Server - sqlserver.exe not using both CPU cores

    - by fistameeny
    Hi, I am working on a virtual machine that runs SQL Server Express (as part of Sage Line 50 Manufacturing). The details are as follows:
    Physical server (host machine):
      - Intel Xeon quad-core 2.1 GHz
      - 4 GB RAM
      - VMDK image stored on RAID-5 500 GB SATA drives (7200 RPM)
      - Running Ubuntu 10.04 Server 64-bit
      - VMware Server 2
    Virtual machine:
      - Windows Small Business Server 2003
      - Allocated 2 vCPUs and 2 GB RAM
      - Using a 100 GB pre-allocated flat VMDK file
    The problem I have is that there is a process that runs in SQL Server which is CPU-intensive. On the old physical server that we migrated to the virtual machine from, it would utilise both CPU cores, so the sqlserver.exe process would be running at 100% on each core. On the virtual machine it only seems to use one of the two CPU cores, meaning the process runs much more slowly. Question: is there a way to force SQL Server (the sqlserver.exe process) to use both CPU cores and distribute its load between them? Or is there a VMware setting that needs changing to allow processes to use both cores?
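
    Two server-level settings worth checking before blaming VMware (a sketch to run in a query window on the guest; a single serial query will only ever use one core, and Express-class editions are also limited in how many CPUs they will use):
      EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
      EXEC sp_configure 'affinity mask';              -- 0 means SQL Server may use all available CPUs
      EXEC sp_configure 'max degree of parallelism';  -- 0 means parallel plans may use all available CPUs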

    Read the article

  • How to efficiently dump a huge MySQL innodb database?

    - by Jagbir
    I have an Ubuntu 10.04 production MySQL database server where the total size of the databases is 260 GB, while the root partition on which the DB is stored is itself only 300 GB; essentially around 96% of / is full and there's no space left for storing a dump/backup, etc. No other disk is attached to the server as of now. My task is to migrate this database to another server sitting in a different datacenter. The question is how to do that efficiently and with minimum downtime. I'm thinking along these lines: (1) request that an extra drive be attached to the server and take a dump onto that drive; (2) transfer the dump to the new server, restore it, and make the new server a slave of the existing one to keep the data in sync; (3) when the migration is due, break replication, update the slave config to accept read/write requests, make the old server read-only so it won't entertain any write requests, and tell the app developers to update their config with the new IP address for the DB. What are your suggestions to improve this, or is there a better alternative approach for this task?
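
    For step 1, one way to avoid needing the extra drive at all is to stream the dump straight to the new server over SSH - a sketch only, with placeholder host names, and --master-data assumes binary logging is already enabled on the old server:
      mysqldump --single-transaction --master-data=2 --all-databases | gzip | ssh user@new-server 'gunzip | mysql'
    --single-transaction takes a consistent InnoDB snapshot without locking the tables, and --master-data=2 records the binlog coordinates in the dump as a comment so the new server can be pointed at the old one with CHANGE MASTER TO for step 2.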

    Read the article

  • RDP add domain users broken

    - by Robuust
    I have 3 servers: a domain controller with DNS services, a DHCP/RRAS server, and a file/random server with files stored on it - nothing special so far. All servers have static IPs, all are in the same domain (SOFTWARE), RDP is enabled on all 3, and all are Windows Server 2008 R2. I can connect to the DHCP/RRAS server via RDP, but I cannot connect to the DC or the file server. When I add RDP users (both are domain admins, for testing) to the file server they show up like this: [screenshot not included]. What is happening that I don't see? And additionally, why don't I even get a login screen for RDP? Thanks in advance.

    Read the article
