Search Results

  • Problem opening password encrypted .docx file on Word 2003

    - by molecule
    Hi all, I am having a problem opening a .docx file in Word 2003. I have installed the Compatibility Pack for the 2007 formats, but when I try to open this particular file, I receive the error "Word experienced an error trying to open the file. Try these suggestions. 1. Check the file permissions for the document. 2. Make sure there is sufficient free memory and disk space. 3. Open the file with the Text Recovery converter." I do not think any of those suggestions apply, as I am able to open the file on a different PC that also runs Word 2003. I also do not have any issues opening non-password-encrypted .docx files. Has anyone experienced the same issue? Most posts on the internet suggest "Open and Repair", but as mentioned, I am able to open this file on another PC without any problems. Any advice is greatly appreciated. Thanks, George

    Read the article

  • Intel RAID0 on Windows 8 not Displaying Correct Media Type

    - by kobaltz
    My primary C: drive consists of two Intel 120 GB SSDs in RAID0. I have a clean install of Windows 8 Pro, the latest MEI software, the latest RST software, and the latest Intel Toolbox. Prior to this I had installed Windows 8 Pro as an upgrade. When I went into Optimize Drives on the upgrade installation, it showed the media type as Solid State Drive. However, now that I am on a brand new install, it shows the media type as Hard Disk Drive. I am worried about this because TRIM may not be working properly: on the upgrade install it showed SSD as the media type, and the Optimize option would perform a manual TRIM. Unfortunately, my search terms on Google (RAID0, SSD, Windows 8, media type) are so common to many other things that all I am finding are unrelated topics.
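
    For what it's worth, I know I can at least check whether Windows itself has TRIM enabled from an elevated command prompt (as I understand it, this only reflects the OS-level setting, not whether the RAID driver actually passes TRIM through):

        rem Query the OS-level TRIM setting
        fsutil behavior query DisableDeleteNotify
        rem DisableDeleteNotify = 0 means Windows will issue TRIM commands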

    Read the article

  • How long do uploaded files stay in the tmp folder in Linux Ubuntu?

    - by Jean-Nicolas Boulay Desjardins
    I am building a web application where my users will be able to upload files. After the files are uploaded, I need to send them to two other servers, and then delete them from the server they were originally uploaded to. I am wondering whether it is a good idea to keep the uploaded files in /tmp while they are being sent to the other two servers, or whether I should move them to another folder in case they get deleted. I would also like to know whether I have to build a cron script to remove the files that have already been transferred to the other servers, so that I get my disk space back.
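
    If I do end up writing my own cleanup job, I assume a cron entry along these lines would do it (the /var/uploads path and the one-day age are placeholders for whatever I end up using):

        # Every hour, delete files that were uploaded more than a day ago
        0 * * * * find /var/uploads -type f -mtime +1 -delete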

    Read the article

  • Website is accessible through dns1. but not with www

    - by Pushpendra
    I have a domain and I am using Freehostia as my web host. In the domain's name server settings I have registered both of Freehostia's name servers, and in my control panel the hosted domains section shows "1 Hosted Domains / 1 Domains Listed". However, clicking on it shows the error "The selected domain name has not been registered yet. Please register it from the Domain Manager section first". Whenever I try to access my website using dns1. it is accessible, but using www. it is not. For example, if my domain name is example.com and I type dns1.example.com, my web page opens, but when I type www.example.com I get "Oops! Google Chrome could not find www.example.com". For information, 24 hours have passed since I registered my name servers.
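
    Using example.com as the stand-in again, I assume I can compare what DNS actually returns for the two host names with dig:

        dig +short dns1.example.com A   # returns an address, so an A record exists for this name
        dig +short www.example.com A    # returns nothing if no A or CNAME record is published for www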

    Read the article

  • Weird behaviour from a cron job

    - by The DOCTOR from TARDIS
    I have set up the crontab like this:

        */5 0 * * * /www/permitChat.sh

    and /www/permitChat.sh is this:

        # We are setting the name of the file
        # in the variable, along with the complete path.
        sFilePath=`date +\/www\/ChatLogs\/%Y\/%m/%d_%m_%Y.txt`

        # First we set its permissions to
        # readable by all users, and then
        # modify them to be writable by only root.
        chmod a=r $sFilePath
        chmod u+w $sFilePath
        ls -lh $sFilePath

    The trouble I am facing is that the job gets executed after 12:00 PM every day, instead of running every 5 minutes from 12:00 AM to 01:00 AM. What could be wrong? All my system variables appear to be synced.
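
    For reference, this is how I read the five schedule fields in my crontab entry (just my understanding written out):

        # minute  hour  day-of-month  month  day-of-week  command
        # */5     0     *             *      *            /www/permitChat.sh
        # */5 in the minute field = every 5 minutes; 0 in the hour field = only between 00:00 and 00:59 (12 AM to 1 AM)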

    Read the article

  • How to minimize services in Ubuntu

    - by codeomnitrix
    Hello everyone. I have been using Ubuntu for the last 6-7 months. I have 512 MB of RAM and a P4 processor running at 1.73 GHz. Being a programmer, I have to work with IDEs like Eclipse and NetBeans, and they sometimes hang. Is there any option in Ubuntu to stop running services, like "msconfig" or Computer Management in Windows? And where can I find details about the services, so that I know what the effect will be if I stop one? I am using Ubuntu 10.10. Thanks.
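
    From the terminal, I assume I can at least list what is running with commands like these:

        service --status-all   # list System V init services and their status
        initctl list           # list Upstart jobs and their current state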

    Read the article

  • Malzilla Tutorial #4 - how to get it working

    - by Sim
    I am trying to understand the basic workflow of using Malzilla, and I am therefore doing the tutorials. But already in the first one where you actually get to do something, I am lost, as it doesn't work for me at all even though I follow it step by step. In the first paragraph, where you have to concatenate the string and then translate it using MiscDecoders->DecodeUCS2, I get non-printable characters as the result after concatenating and running the decoder. In the second paragraph, where you are told to run the script after adding the function call at the bottom of the script, the decoder tells me that it cannot handle the arguments.callee.toString() method and therefore won't run the script. How outdated are those tutorials, and how can I get them to work?

    Read the article

  • Is it possible to spoof the From: field in Outlook?

    - by tsv
    I am wondering if it is possible to change the From: field (not just the Reply-To) in Outlook (specifically in the 2010 beta, but I am also interested in other versions). I am just moving over from Linux and am used to being able to do this quite simply in some clients. At first I thought it was the "Other E-Mail Address" option on the From drop-down menu in a compose window, but that seems to do... nothing! I want to be able to do this so I can make email look like it comes from a domain I own but do not wish to run an SMTP server on.

    Read the article

  • Recycle application pool, warm-up scripts - performance tuning in a SharePoint WCM site

    - by joel14141
    I am trying to tune a public-facing WCM site we have in SharePoint, and I have the following doubts. By default, application pools are set to recycle themselves at 2 AM, and because of that we need warm-up scripts. As I was googling this topic I found mixed reactions: some MVPs say it is not advisable to recycle the application pool daily, and some say otherwise, so I am confused. If I am not recycling the application pool, then I don't have to use warm-up scripts. But since my site is public-facing and accessed from all around the globe, is it advisable to recycle it daily, given that it will affect the performance of my site? Even if I run a warm-up script after each recycle, I don't think it would be as good as it should be. Any advice on that?

    Read the article

  • Using wbadmin to back up and recover

    - by g7rpo
    Hi, I am using wbadmin to perform backups of a specific folder, primarily to back up my VHD files, and this is working fine. However, I tried to recover the files today using a different machine from the one which created the backup, and I couldn't get the machine doing the recovery to 'see' the backups. Is there a way to do this? My worry is that if the host performing the backups fails, I need to be able to install Hyper-V on another host and recover the backed-up VMs there until I can rebuild the original host. It appears that this isn't possible; I am hoping I am missing something. Any help would be greatly appreciated.
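
    For reference, this is the kind of command I expect to need on the recovery machine (the share path and machine name below are placeholders, not my real ones):

        rem List the backup versions stored at the target that were created by a specific machine
        wbadmin get versions -backupTarget:\\backupserver\vhdbackups -machine:HYPERVHOST1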

    Read the article

  • DNS subdomain problem - Hover.com

    - by Ryan Sullivan
    I use hover.com to manage my domain names. I am having a huge problem with pointing a subdomain at a specific IP address: I set an A record for the subdomain on a particular domain name that I own and pointed it at the IP address, and it is not working at all. The thing that is confusing me is that when I point a subdomain on a different domain name at the same IP address, it works just fine. Also, I have since deleted the DNS record from the domain where it happened to work, and when I type that address into a browser it still resolves to the IP I had it set to. I am not sure what is going on at all. If this seems confusing I am sorry, but I am very confused about the whole thing myself. If any clarification is needed, just ask and I will try to clear things up.

    Read the article

  • How soon does nginx's token bucket replenish when limiting at requests per minute?

    - by Michael Gorsuch
    We've decided that we want to experiment and limit requests per minute instead of requests per second on our sites. However, I am confused by the burst parameter in this context. I am under the impression that when you use the 'nodelay' flag, the rate limiting facility acts like a token bucket instead of a leaky bucket. That being the case, the bucket size is equal to the burst parameter, and every time you violate the policy (say 1 req/s), you have to put a token in the bucket. Once the bucket is full (equal to the burst setting), you are given a 503 error page. I am also under the impression that once a violator stops going against the policy, a token is removed from the bucket at a rate of 1 token/s, allowing him to regain access to the site. Assuming I have the above correct, my question is what happens when I start regulating access per minute? If we choose 60 requests per minute, at what rate does the token bucket replenish?
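
    The configuration I have in mind is roughly this (the zone name and sizes are just example values):

        limit_req_zone $binary_remote_addr zone=perminute:10m rate=60r/m;

        server {
            location / {
                limit_req zone=perminute burst=20 nodelay;
            }
        }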

    Read the article

  • Openfire Installation Issue - Can't log in to admin panel

    - by Lobe
    I am trying to install Openfire on an Ubuntu virtual machine; however, upon completing the web-based installer, I am unable to log in to the admin panel. So far I have:

    - downloaded the Debian installer
    - installed it using the stock options
    - added the database and built the structure using the supplied SQL file
    - completed the web-based installer

    I am now trying to log in with the username admin and my password, but I constantly get a wrong username/password error. There is a record in the MySQL database showing the admin user with an encrypted password, and changing it to an unencoded password doesn't work. What is the problem here?

    Read the article

  • Redirect Web Subfolder to Root (/folder to /)

    - by manyxcxi
    I am trying to redirect /folder to / using .htaccess, but all I am getting is the Apache HTTP Server Test Page. My root directory looks like this:

        /
        .htaccess
        -/folder
        -/folder2
        -/folder3

    My .htaccess looks like this:

        RewriteEngine On
        RewriteCond %{REQUEST_URI} !^/folder/
        RewriteRule (.*) /folder/$1

    What am I doing wrong? I checked my httpd.conf (I'm running CentOS) and the mod_rewrite module is being loaded. As a side note, my server is not a www server; it's simply a virtual machine, so its hostname is centosvm. Addition: my httpd.conf looks like this:

        <VirtualHost *:80>
            ServerName taa.local
            DocumentRoot /var/www/html
            SetEnv APPLICATION_ENV "dev"
            Alias /taa /var/www/html/taa/public
            <Directory /var/www/html/taa/public>
                DirectoryIndex index.php
                AllowOverride All
                Order allow,deny
                Allow from all
            </Directory>
        </VirtualHost>
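
    If what I actually want is an external redirect from /folder/... back to the same path at the site root (one reading of my goal), I assume the root .htaccess would look more like this:

        RewriteEngine On
        # Redirect any request under /folder/ to the same path at the site root
        RewriteRule ^folder/(.*)$ /$1 [R=301,L]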

    Read the article

  • Windows hosted network on Windows 8

    - by tanmaysingh
    My sister is using an Acer laptop that has an Intel WLAN card and supports Windows hosted networks (checked in the command prompt). I am using the basic approach: the netsh wlan commands to create the hosted network and then start it. The hosted network gets created and can be seen in "Network and Sharing Center" as the newly created connection. I also went to Device Manager, selected the WLAN card, and made sure "allow this device to wake the computer from sleep" is selected. Then I went to the newly created connection, clicked Properties, and under the Sharing tab enabled the option to allow the connection to be shared, but I am not able to select the name of the Wi-Fi my sister wants to share (in this case her college Wi-Fi, let's say QP4); Ethernet and Local Area Connection are the only two options that come up. Also, the SSID is not being shown on her mobile, nor when I click the network icon in the tray area. Any suggestions on what might be going wrong? She is using Windows 8. Should I look for updated drivers? Her laptop is only 6 months old, so I don't think the drivers are outdated. Any advice would be appreciated.
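
    For reference, these are the commands I mean by the basic approach (the SSID and key here are example values, not the ones I actually used):

        netsh wlan set hostednetwork mode=allow ssid=ExampleNet key=ExamplePass123
        netsh wlan start hostednetwork
        netsh wlan show hostednetwork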

    Read the article

  • Disable RAID Controller

    - by B.Mr.W.
    I have some decent HP ProLiant servers that come with the "HP Smart Array P410i Controller" enabled. I am using these boxes to set up a Hadoop cluster, and I know RAID is a definite no-no for Hadoop, since the application itself takes care of data redundancy, and the extra intelligence provided by RAID won't be helpful and might hurt performance. I tried to disable the devices in the BIOS, and afterwards the box cannot even access the disks. So I am assuming the controller sits between the disks and the motherboard, and we have to turn it on and configure it to "level 0" or something like that. I am wondering what I should do to "disable" the RAID functionality so that it fits the Hadoop environment.

    Read the article

  • Linux install error "dracut Warning: Can't mount root filesystem."

    - by NBB
    I am installing Fedora 16. I inserted the CD to install Fedora 16 on my laptop; however, I am getting an error like "dracut Warning: Can't mount root filesystem." (screenshot: http://cfile7.uf.tistory.com/image/176BAA3C4EBF9F89051FA7). I am not really sure how to fix it; this is the first time I have tried to install Fedora 16 on this laptop. The laptop previously had Windows 7 Professional installed, and I have not installed any kind of Linux before. Does anyone know how to fix this problem?

    Read the article

  • Datacenter Backup Strategy

    - by EasyEcho
    What are common approaches to backup solutions in remote data centers? I am already familiar with general backup principles and have a very good backup strategy for our local data center, but I am having great difficulty extending it to a remote data center. We currently do a full backup on Friday and differentials Mon - Thu, then rotate disks offsite on Friday morning... rinse and repeat week after week. BTW, we use disks and have been very happy with this approach. We could buy a large storage server and back up everything to it, but that doesn't give us an offsite copy. We could encrypt and upload to Amazon or some other online storage, but that would take a large amount of time given the amount of data and would be rather expensive, paying for the bandwidth leaving the data center and received at Amazon. We could drive to the data center every Friday and continue to rotate disks as we do now, but that just seems old-fashioned. What am I missing? Are there better options?

    Read the article

  • Setting up Ubuntu Server for hosting Java web applications

    - by Denis Hoss
    I'm trying to set up a web server running Ubuntu Server to host some Java web applications, with MySQL running on it, and so on. Here is the tutorial I am following: perfect server ubuntu 11.10. The server configuration is:

        CPU: S1155 Intel Pentium G850 2.9 GHz (integrated VGA, 5 GT/s, 3 MB cache, 65 W)
        MB:  Gigabyte GA-H77-D3H
        RAM: 4x4 GB
        HDD: 5x1 TB Seagate (4 in RAID5 and 1 for backup)

    The problem is that when I try to install the Server version of Ubuntu and the installer asks whether to activate ATA RAID devices, if I click yes it sees only that one RAID device, and if I click no it sees all 5 HDDs separately without any RAID. Is this normal? I also tried to install the Desktop version on the RAID5 array, but after restarting, Ubuntu doesn't want to boot; there is just an underscore at the top of the screen. I am a newbie at servers and their configuration; in fact, I am a developer. I need some help from you guys.

    Read the article

  • ZFS, dedupe and PST files

    - by Unreason
    I am interested to know what the expected maximum dedupe ratio would be for a set of PST files. I have ~40 GB of PST files from ~15 users with a high level of duplication of attachments. I am running tests to see if I can get significant space savings by storing the data on ZFS with dedupe. For this purpose I have installed a test setup of Nexenta, but I was wondering if someone here had already done this and what level of deduplication I might expect (or, in other words, how sensitive are PST files to block alignment, and what parameters can influence the ratio?). Initial tests show a very low dedupe ratio, and I did find an explanation that block-level dedupe would not be efficient here and that byte-level dedupe would be much better (and that it should be performed by an application that is aware of the internal organization), so I am just double-checking here in case someone has more input. Otherwise I will probably be converting the PST files to IMAP.
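
    For reference, this is roughly how I have been enabling dedupe and checking the ratio on the test pool (the pool and dataset names are just examples from my setup):

        zfs set dedup=on tank/pst       # enable block-level dedupe on the dataset holding the PST files
        zpool list tank                 # the DEDUP column shows the ratio actually achieved
        zdb -S tank                     # simulate dedupe on the existing data to estimate the ratio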

    Read the article

  • How to know if my nginx is in good health?

    - by Howard
    I am running nginx on EC2 (m1.small) for SSL termination. I am using 2 workers on Ubuntu with the latest stable nginx; network throughput is around 2 Mbps and the system load average is around 2 to 3. I am wondering whether this system is in good health for now, e.g. what the queue length is (I know nginx can handle a lot of concurrent requests, but I mean how many requests have to wait before being served) and what the average queue time is for a given request. I want to know because if my nginx is CPU-bound (e.g. due to SSL), I will need to upgrade to a faster instance. My current nginx status:

        Active connections: 4076
        server accepts handled requests
         90664283 90664283 104117012
        Reading: 525 Writing: 81 Waiting: 3470
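
    The status output above comes from the stub_status module; the location block I use to expose it looks roughly like this (the path and allowed address are my own choices):

        location /nginx_status {
            stub_status on;      # expose basic connection and request counters
            allow 127.0.0.1;     # only allow local checks
            deny all;
        }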

    Read the article

  • Accessing localhost on IIS7 from another computer on the network

    - by Adam
    I recently upgraded computers to Windows 7 Professional and am running IIS7. When I'm on my computer I can easily access localhost through my web browser, but when I try from another computer on my network (replacing localhost with my computer name) it doesn't work. I also tried using "computername.domain.com" and still no luck. I can access other computers running Windows XP and IIS 5, but I'm having no luck accessing my own machine from another computer. I checked, and my IIS7 has anonymous authentication enabled. Is this an IIS7 thing, or am I missing some other setting? Thanks in advance!

    Read the article

  • BIG-IP - HTTPS Health Monitor setup

    - by djo
    We have a website with health-monitoring pages set up so we can take our servers in and out of the BIG-IP as we see fit. We have just moved onto BIG-IP, and the issue I have hit is that you set up health monitors for ports 80 and 443; the port 80 check works fine, but when I try to get the 443 check to look at our file, it fails. I am aware that hitting this page by IP address over HTTPS is going to cause a certificate error, but I would have guessed that BIG-IP would simply accept the certificate and carry on with the check. Is what I want to do possible? Also, is there a way of just using an HTTP monitor for HTTPS? Because if port 80 stops serving traffic and I use the same monitor for 443, it will stop traffic to that too. Any help would be great! Thanks

    Read the article

  • Print job doesn't go to print queue

    - by flatsguide
    I have two printers hooked up to my Intel iMac and am having a printing error. Whenever I try to print a simple text document, I am unable to get the print job to go to the print queue with one of my printers. I have an HP C7280 and an HP C3100. I am able to get one working properly, but the other doesn't seem to let the job reach the print queue. I have switched USB cables (with the printer that I know works), and both printers are recognized by the computer in the printer preferences pane. I've tried resetting the printing system in the printer setup utility, reloaded the drivers from HP, etc. If anyone has a suggestion or could point me to a little help, I'd be very grateful. Best regards, B

    Read the article
