Search Results

Search found 30111 results on 1205 pages for 'best practices analyzer'.


  • Best quality/price shared Web Hosting

    - by embedded
    I'm looking for web hosting for my iPhone app. My needs are as follows: PHP 5, MySQL 5, curl, shared SSL, cron jobs, fast support, and a money-back guarantee. What do you think about these two: IX Web Hosting and HostGator? Do you recommend working with either of them? I appreciate any advice. Thanks

    Read the article

  • Stack Overflow works best with JavaScript enabled banner

    - by carpenter
    I am trying to mimic this site's "JavaScript required" banner. I have the divs below, which are hidden if JavaScript is enabled, but I am getting a flash/glimpse of the banner on page load.

    HTML:

        <div id="Main_noJS">Craftystuff.com works best with JavaScript enabled</div>
        <div id="PartOfMain_noJS"><br /></div>

    CSS:

        #Main_noJS {
            width: 100%;
            height: 23px;
            font-family: Arial;
            font-size: 111%;
            color: White;
            font-weight: bold;
            background: #AE0000;
            text-align: center;
            padding-top: 4px;
            position: fixed;
            z-index: 100;
        }

    JavaScript:

        // hide the "Craftystuff.com works best with JavaScript enabled" banner, if JavaScript is working
        if ($("#Main_noJS").length) {
            $("#Main_noJS").hide();
            // hide the spacer between the main content and the banner...
            $("#PartOfMain_noJS").hide();
        }

    So the banner is visible to start with, and it is only hidden once JavaScript runs, but that takes a moment, so I get a brief glimpse of the banner. How can I stop the banner from flashing when the page first loads?

    Read the article

  • Best way to handle PHP sessions across Apache vhost wildcard domains

    - by joshholat
    I'm currently running a site that allows users to use custom domains (i.e. instead of mysite.com/myaccount, they can have myaccount.com). They just change the A record of their domain and we then use a wildcard vhost in Apache to catch the requests for the custom domains. The setup is basically as seen below. The first vhost catches the mysite.com/myaccount requests and the second is used for myaccount.com. As you can see, they have the exact same path and PHP cookie_domain. I've noticed some weird behavior surrounding the line below "# The line below me". When it is active, the custom domains get a new session_id on every page load (one that isn't the same as the non-custom-domain session). However, when I comment that line out, the user keeps the same session_id on each page load, but that session_id is still not the same as the one they'd see on the non-custom-domain site, despite everything being on the same server. There is a sort of "hack" workaround involving redirecting the user to mysite.com/myaccount, getting the session ID, redirecting back to myaccount.com, and then using that ID on myaccount.com. But that can get messy (i.e. if the user logs out of mysite.com/myaccount, how does myaccount.com know?). For what it's worth, I'm using a database to manage the sessions (so there are no issues with being on different servers, etc., but that's irrelevant since we only use one server to handle all requests currently anyway). I'm fairly certain it is related to some sort of CSRF browser protection, but shouldn't it be smart enough to know it's on the same server? Note: these are not subdomains, they're separate domains entirely (but on the same server).

        <VirtualHost *:80>
            DocumentRoot "/opt/local/www/mysite.com"
            ServerName mysite.local
            ErrorLog "/opt/local/apache2/logs/mysite.com-error.log"
            CustomLog "/opt/local/apache2/logs/mysite.com-access.log" common
            <Directory "/opt/local/www/mysite.com">
                AllowOverride All
                #php_value session.save_path "/opt/local/www/mysite.com/sessions"
                php_value session.cookie_domain "mysite.local"
                php_value auto_prepend_file "/opt/local/www/mysite.com/core.php"
            </Directory>
        </VirtualHost>

        # Wildcard (custom domain) vhost
        <VirtualHost *:80>
            DocumentRoot "/opt/local/www/mysite.com"
            ServerName default
            ServerAlias *
            ErrorLog "/opt/local/apache2/logs/mysite.com-error.log"
            CustomLog "/opt/local/apache2/logs/mysite.com-access.log" common
            <Directory "/opt/local/www/mysite.com">
                AllowOverride All
                #php_value session.save_path "/opt/local/www/mysite.com/sessions"
                # The line below me
                php_value session.cookie_domain "mysite.local"
                php_value auto_prepend_file "/opt/local/www/mysite.com/core.php"
            </Directory>
        </VirtualHost>

    Read the article

  • What is the best software for desktop recording?

    - by Ivo Flipse
    I know this question is partially a stub of http://superuser.com/questions/201/free-desktop-recording-screencasting-on-windows. However, I would have to use this at work, so it doesn't have to be free, it just has to work very well. Borrowed from the free-software post, I would expect such an application to have these features:
    * choose the whole desktop, a region, or a window to record
    * zoom in on an area, and camera moves
    * save the recorded movie to a compressed format
    * have basic editing tools
    * have a mouse-highlighting feature
    * highlight a window/field on the screen (any trick will do)
    * display pressed keys/key combinations (like the iPhone does)
    So the question is: which desktop recording/screencasting software would you recommend?

    Read the article

  • Best format for backing up data on Blu-ray

    - by Arrieta
    We are in the process of backing up our hard drives to Blu-ray discs. I am creating tar.gz files and burning them to Blu-ray. Is there a simple (preferably Python-based) way to create images of those tar.gz files at a predetermined size (to fit on a Blu-ray disc) and simply burn those images to disc? Do you have any other approach for creating physical backups of your hard drives?
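
    One workable approach, sketched below in Python, is to skip true disc images and instead split the finished tar.gz into fixed-size parts (much like the Unix split command), burn one part per disc, and later re-join them with cat backup.tar.gz.part* > backup.tar.gz. The archive name and the 24 GB part size (headroom under a 25 GB single-layer BD-R) are illustrative assumptions.

        import os

        BUF = 64 * 1024 * 1024      # copy in 64 MB buffers to keep memory use low
        CHUNK = 24 * 1024 ** 3      # stay safely under a 25 GB single-layer BD-R

        def split_archive(path, chunk_size=CHUNK):
            """Write path.part000, path.part001, ... each at most chunk_size bytes."""
            part = 0
            with open(path, "rb") as src:
                while True:
                    written = 0
                    part_name = f"{path}.part{part:03d}"
                    with open(part_name, "wb") as dst:
                        while written < chunk_size:
                            data = src.read(min(BUF, chunk_size - written))
                            if not data:
                                break
                            dst.write(data)
                            written += len(data)
                    if written == 0:        # source exhausted: drop the empty part
                        os.remove(part_name)
                        break
                    part += 1

        if __name__ == "__main__":
            split_archive("backup.tar.gz")   # hypothetical archive name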

    Read the article

  • What is the best nginx compression gzip level?

    - by Chamnap
    I'm using the nginx reverse proxy cache with gzip enabled. However, I'm having problems with HTTP requests from Android applications to my Rails JSON web service. It seems that when I turn off the reverse proxy cache it works fine, because the response then comes back without gzip. Therefore, I think the problem is caused by gzip. What is the most appropriate level of gzip compression?

        gzip on;
        gzip_http_version 1.0;
        gzip_vary on;
        gzip_comp_level 6;
        gzip_proxied any;
        gzip_types text/plain text/css text/javascript application/javascript application/json application/x-javascript text/xml application/xml application/xml+rss;
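
    Levels above 6 usually cost noticeably more CPU for only marginally smaller output, which is why values around 4-6 are common choices. A rough Python sketch to measure the size/CPU trade-off on your own payloads is below (it uses Python's gzip module rather than nginx itself, and the sample file name is a placeholder for a typical JSON response):

        import gzip
        import time

        # "sample_response.json" is a placeholder; save a typical API response to it first
        with open("sample_response.json", "rb") as f:
            payload = f.read()

        for level in range(1, 10):
            start = time.perf_counter()
            size = len(gzip.compress(payload, compresslevel=level))
            ms = (time.perf_counter() - start) * 1000
            print(f"level {level}: {size:>9} bytes  {ms:6.1f} ms")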

    Read the article

  • Best user portal?

    - by John
    Hi guys, can anyone recommend a good user portal? I'm looking to create a portal for users, to provide them with some self-help material. It has to be REALLY simple! Thanks, John

    Read the article

  • Best data + partition recovery software

    - by Pennf0lio
    I accidentally formatted my drive D, which contained all my backups and documents. I had separated my files onto drive D hoping I would not harm them. When I used Acronis Recovery to install a new OS with some pre-installed applications to my HDD, I didn't realize I had also formatted/erased drive D. Now drive D is unpartitioned. I am really in deep trouble and need some urgent help. Please recommend software that can at least restore my old drive containing my files. I'm assuming most of you think this is a duplicate of some old questions here, but I'm not looking for plain data recovery; I need to recover the whole partition with its files. I used to use "Recuva", but it only recovers files, not whole folders with the files in them. Please advise. Thank you!

    Read the article

  • What's the best self-tracking software for Linux?

    - by trench
    I'm looking for a way to track myself and receive quality data upon which I can write future scripts/programs. For example, I use Google Reader a lot. I'd like to track the hrefs that garner my clicks. Further, I'd like to drop all of the words of each href into a database where they can be stacked in a hierarchical manner. At the end of the week I want to know that "Ubuntu" garnered 448 clicks and "Cheetos" garnered 2. :) That's just one example... I'd like this tracking and data-collecting to extend beyond my browser. I know writing something to do this myself wouldn't be too awfully difficult but if something already exists I'd happily use it. Thanks in advance. Primary OS: Ubuntu 10.04
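
    As a rough illustration of the data side only (capturing the clicks themselves would need a browser extension or proxy, which this sketch assumes already exists), here is a minimal Python sketch: it assumes clicked URLs are appended to a plain-text log, one per line, splits each URL into words, and tallies them in SQLite. The file names are placeholders.

        import re
        import sqlite3
        from urllib.parse import urlparse

        def tally_clicks(log_path="clicks.log", db_path="clicks.db"):
            db = sqlite3.connect(db_path)
            db.execute("CREATE TABLE IF NOT EXISTS words (word TEXT PRIMARY KEY, hits INTEGER)")
            with open(log_path) as log:
                for line in log:
                    url = urlparse(line.strip())
                    # split host and path into plain lowercase words
                    for word in re.findall(r"[a-z]+", (url.netloc + url.path).lower()):
                        db.execute("INSERT OR IGNORE INTO words VALUES (?, 0)", (word,))
                        db.execute("UPDATE words SET hits = hits + 1 WHERE word = ?", (word,))
            db.commit()
            # weekly report: the ten most-clicked words
            for word, hits in db.execute("SELECT word, hits FROM words ORDER BY hits DESC LIMIT 10"):
                print(word, hits)

        if __name__ == "__main__":
            tally_clicks()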

    Read the article

  • What is the best way to extend restful_authentication/AuthLogic to support lazy logins by an anonymous user?

    - by Kevin Elliott
    I'm building an iPhone application that talks to a Ruby on Rails backend. The Ruby on Rails application will also serve web users. The restful_authentication plugin is an excellent way to provide quick and customizable user authentication. However, I would like users of the iPhone application to have an account created automatically, keyed by the phone's unique identifier ([[UIDevice device] uniqueIdentifier]) stored in a new column. Later, when users are ready to create a username/password, the account will be updated to contain the username and password, leaving the iPhone unique identifier intact. Users should not be able to access the website until they've set up their username/password. They can, however, use the iPhone application, since the application can authenticate itself using its identifier. What is the best way to modify restful_authentication to do this? Create a plugin? Or modify the generated code? What about alternative frameworks, such as AuthLogic? What is the best way to allow iPhones to get a generated auth token locked to their UUIDs, but then let the user create a username/password later?

    Read the article

  • Best way to keep configuration for server reinstallation?

    - by Gunnar
    I have a server at home running Ubuntu 12.04 which has grown messy over the years. I have fiddled with various packages, desktop environments (for VNC), etc., and I would like to reinstall it to start again and have better control over what goes into the box. But I want to keep much of the configuration after reinstallation, like the LVM configuration, apache2, samba, etc. Ideally there would exist a program which could analyze /etc, the installed packages and such, store the information, and selectively put it back into the new installation. I am even considering installing Ubuntu Server on a virtual machine, just to be able to compare the contents of /etc with a clean installation, and even perform a migration to the virtual machine first, to verify that the transfer process works. How does one go about performing this kind of reinstallation? Has anyone seen any resources on the net on this topic?
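
    A rough Python sketch of the comparison idea is below: it dumps the installed-package list with dpkg and diffs the live /etc against a clean installation mounted at a placeholder path (/mnt/clean). The package list can later be replayed on the new system with dpkg --set-selections < packages.txt followed by apt-get dselect-upgrade; etckeeper, which keeps /etc under version control, covers part of this as well.

        import filecmp
        import subprocess

        def dump_packages(outfile="packages.txt"):
            # `dpkg --get-selections` lists installed packages; restore later with
            # `dpkg --set-selections < packages.txt && apt-get dselect-upgrade`
            with open(outfile, "w") as f:
                subprocess.run(["dpkg", "--get-selections"], stdout=f, check=True)

        def report(cmp, prefix=""):
            for name in cmp.left_only:
                print(f"only on this server: {prefix}/{name}")
            for name in cmp.diff_files:
                print(f"modified:            {prefix}/{name}")
            for name, sub in cmp.subdirs.items():
                report(sub, f"{prefix}/{name}")

        if __name__ == "__main__":
            dump_packages()
            # /mnt/clean is a placeholder for a mounted clean Ubuntu installation
            report(filecmp.dircmp("/etc", "/mnt/clean/etc"), prefix="/etc")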

    Read the article

  • Best way to upscale a video?

    - by Josh
    If I have a video file at 320x240 resolution which I want to re-encode (because I don't like the encoding it's in now), and I also want to play it at double size (640x480), will I get higher quality if I scale it up to 640x480 when I convert it to the new format, versus keeping it at 320x240 in the new format and playing it at double size? This probably depends on the program used to convert, and if so, please let me know any program which might increase the quality. Here's my thinking: if I play a 320x240 file at double size, the system has to scale up each frame in real time, whereas if I scale up while recompressing, the system may be able to use a more intensive algorithm like bicubic interpolation. However, I am not sure if this is true or not.

    Read the article

  • Best way to clean the Apple Mighty Mouse?

    - by lostInTransit
    As much as I love Apple products' designs, I still don't like the Mighty Mouse. The scrolling keeps cutting out very frequently. I tried the instructions provided on Apple's site, and they do make the scrolling smooth, but only momentarily. Isn't there any way to open it up and clean it like a normal mouse? Or any other way to clean it better?

    Read the article

  • Best tools for "ssh tail -f" style log file monitoring and analysis

    - by dougnukem
    I'm looking for a tool to monitor custom PHP error logs/Apache logs and possibly Java logs on remote development servers. I'm not looking for a full production log system like Splunk, but something a little more flexible than an ssh terminal running "tail -f". Perhaps something that will:
    * monitor multiple log files, saving them to my local machine for searching/analysis later
    * allow "alerts" when certain strings appear in a log
    * provide some kind of tabbed/dashboard view of the multiple logs being monitored (fewer than 10 logs in total)
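
    None of this is a dashboard, but as a rough Python sketch of the first two bullets: follow each remote log over "ssh tail -F" in its own thread and flag lines that contain alert strings. The host names, log paths, and alert strings are placeholder assumptions; a real setup would add reconnect handling.

        import subprocess
        import threading

        ALERTS = ("Fatal error", "Exception", "segfault")   # strings that trigger an alert

        def follow(host, path):
            # stream `tail -F` over ssh and prefix lines that match an alert string
            proc = subprocess.Popen(
                ["ssh", host, "tail", "-F", path],
                stdout=subprocess.PIPE, text=True, bufsize=1,
            )
            for line in proc.stdout:
                tag = "ALERT" if any(s in line for s in ALERTS) else "     "
                print(f"{tag} [{host}:{path}] {line}", end="")

        if __name__ == "__main__":
            targets = [("dev1.example.com", "/var/log/apache2/error.log"),
                       ("dev1.example.com", "/var/log/php_errors.log")]
            threads = [threading.Thread(target=follow, args=t, daemon=True) for t in targets]
            for th in threads:
                th.start()
            for th in threads:
                th.join()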

    Read the article

  • Best practice RAID groups for EqualLogic PS6510X

    - by 20th Century Boy
    We are thinking about purchasing 4 x EqualLogic PS6510X SANs (the Sumo boxes). Each has 48 x 600 GB 10k SAS drives. They will be stacked to form a logical pool of storage (all in the same location). I understand that when you create a RAID group, it's done on a per-box basis, so one box could be RAID 50, another RAID 10, etc. My question is: should I make one box a "performance" box, i.e. RAID 10, and the other boxes "standard", i.e. RAID 50? How do people configure their EQL arrays in the real world?

    Read the article

  • Best way to grow Linux software RAID 1 to RAID 10

    - by Hans Malherbe
    mdadm does not seem to support growing an array from level 1 to level 10. I have two disks in RAID 1. I want to add two new disks and convert the array to a four-disk RAID 10 array. My current strategy:
    1. Make a good backup.
    2. Create a degraded four-disk RAID 10 array with two missing disks.
    3. rsync the RAID 1 array to the RAID 10 array.
    4. Fail and remove one disk from the RAID 1 array.
    5. Add the available disk to the RAID 10 array and wait for the resync to complete.
    6. Destroy the RAID 1 array and add the last disk to the RAID 10 array.
    The problem is the lack of redundancy at step 5. Is there a better way?

    Read the article

  • How to best tune my SAN/initiators for performance?

    - by Disco
    As the recent owner of a Dell PowerVault MD3600i, I'm experiencing some weird results. I have a dedicated 24-port 10 GbE switch (PowerConnect 8024), set up for 9K jumbo frames. The MD3600 has 2 RAID controllers, each with 2 x 10 GbE Ethernet NICs. There's nothing else on the switch; one VLAN for SAN traffic. Here's my multipath.conf:

        defaults {
            udev_dir /dev
            polling_interval 5
            selector "round-robin 0"
            path_grouping_policy multibus
            getuid_callout "/sbin/scsi_id -g -u -s /block/%n"
            prio_callout none
            path_checker readsector0
            rr_min_io 100
            max_fds 8192
            rr_weight priorities
            failback immediate
            no_path_retry fail
            user_friendly_names yes
            # prio rdac
        }
        blacklist {
            device {
                vendor "*"
                product "Universal Xport"
            }
            # devnode "^sd[a-z]"
        }
        devices {
            device {
                vendor "DELL"
                product "MD36xxi"
                path_grouping_policy group_by_prio
                prio rdac
                # polling_interval 5
                path_checker rdac
                path_selector "round-robin 0"
                hardware_handler "1 rdac"
                failback immediate
                features "2 pg_init_retries 50"
                no_path_retry 30
                rr_min_io 100
                prio_callout "/sbin/mpath_prio_rdac /dev/%n"
            }
        }

    And iscsid.conf:

        node.startup = automatic
        node.session.timeo.replacement_timeout = 15
        node.conn[0].timeo.login_timeout = 15
        node.conn[0].timeo.logout_timeout = 15
        node.conn[0].timeo.noop_out_interval = 5
        node.conn[0].timeo.noop_out_timeout = 10
        node.session.iscsi.InitialR2T = No
        node.session.iscsi.ImmediateData = Yes
        node.session.iscsi.FirstBurstLength = 262144
        node.session.iscsi.MaxBurstLength = 16776192
        node.conn[0].iscsi.MaxRecvDataSegmentLength = 262144

    After my tests, I can barely reach 200 Mb/s read/write. Should I expect more than that? Given that it has dual 10 GbE, my expectation was somewhere around 400 Mb/s. Any ideas? Guidelines? Troubleshooting tips?

    Read the article

  • Best method of transferring files over internet?

    - by EsotericHabit
    I have a seedbox (running Ubuntu 9.10) at my (parents') house and will be leaving it there once I go to college this fall. Currently I'm using Samba to transfer files between computers, but I was wondering whether, once I am on my university's network, using FTP would be a better option than Samba over a VPN. The files will range from 100 MB to 17 GB, if that matters. Would one be more efficient than the other? Are there any other options I've forgotten?

    Read the article

  • Best "Keep your music with you" setup

    - by solomongaby
    I really enjoy listening to music, and because of this I have a lot of sources for it. This is not always a good thing, since I have my music stored on a lot of devices (home computer, work computer, online storage, iPod) and sometimes it's a bit difficult to have the same music in all places. Sometimes I don't have the same songs everywhere, or I make a playlist in one place and want it in another, etc. How do you keep your music in sync on all your devices? PS: I use both Windows and Linux.

    Read the article

  • The best software for monitoring users' internet usage

    - by nikospkrk
    Hi, we are a small business using a Vigor 2820 as our internet router, and we'd like to install software that can report on our users' internet usage. I already tried the "official" software made by Draytek, called SmartMonitor, but its reliability is a real issue: it doesn't seem to keep capturing packets after running for 3 to 6 hours (randomly), whereas Wireshark keeps capturing packets well past that point. As I'm really fed up with this tool, I'm looking for other solutions, but I still want the same features: user statistics, website rankings, user traffic, and so on. I have already enabled the port mirroring feature, so it would be perfect if you could suggest port-mirroring-based software (ideally freeware). I thought I had found a good one with Etherscout, but it just doesn't launch. I am even open to a tool that would "just" produce reports from Wireshark capture files (*.pcap). Thank you for any suggestions, Nicolas.
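
    On the last point (reports from *.pcap files), a rough Python sketch is below. It assumes scapy is installed (dpkt or tshark would work just as well) and totals captured bytes per internal source address; the 192.168. prefix and the capture file name are placeholders.

        from collections import Counter
        from scapy.all import rdpcap
        from scapy.layers.inet import IP

        def per_host_bytes(pcap_path):
            # sum captured bytes per internal source IP
            totals = Counter()
            for pkt in rdpcap(pcap_path):
                if IP in pkt and pkt[IP].src.startswith("192.168."):
                    totals[pkt[IP].src] += len(pkt)
            return totals

        if __name__ == "__main__":
            for host, nbytes in per_host_bytes("capture.pcap").most_common(10):
                print(f"{host:15} {nbytes / 1e6:8.1f} MB")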

    Read the article
