Search Results

Search found 9935 results on 398 pages for 'pages'.

  • Gmail - Ways to have more than 20 items per page in the search results

    - by Andrei
    Gmail has the notorious limitation of 20 results per page when searching your mail. Is there an extension (Chrome - preferable, Firefox, etc.) that can fix this (i.e. allow more than 20 items per page)? Based on my experience this should be entirely doable (have the extension move across pages in the background and then collect the results). Is there an extension that can already do this? I'm asking because I couldn't find one.

  • Building a home server with a NAS appliance [closed]

    - by user51666
    Possible Duplicate: Best way to build home NAS with redundancy. I was hoping to get some ideas from folks here. I'm interested in building a home web server with a NAS appliance, primarily for storing pictures and video. I want a networked storage device so multiple devices in our home can access it wirelessly as needed, with the option of password-protected access from outside the house. I'm also interested in customizing and building my own web pages, preferably with Apache. Any preferences? Does anyone have an interesting, neat setup they can share? Thanks!

  • Registry Cleaner, useful or not

    - by garybo
    Hi, I'm constantly seeing ads about registry cleaning. Each time I see one of those ads I remember an article I read a few years ago (I don't remember who wrote it, but it was posted on one of those geek chat pages) about it not being necessary to clean the registry; in fact, the article went on to say that running a registry cleaner sometimes causes more harm than good. I would like to hear your opinion about this, and if you think it is good to use one of these programs, could you recommend a few? Thanks in advance. garybo

  • Delete Google Chrome's tab history

    - by wizlog
    After browsing the web for a while, I want to delete my history. So I press Ctrl+H, click "Edit items" on the blue bar at the top of the page, select the checkboxes for the items I want to remove, and click "Remove selected items". But when I go into any of my open tabs, their history isn't deleted. Without restarting the tab or the browser, is there any way to clear the history within a tab? I'd also like to know how to clear just one tab's history without needing to know all the pages that the tab had visited and then going through the history page...

  • connected ethernet without disabling wireless, now I have *two* LAN ip's?

    - by peter karasev
    OK, I'm on Ubuntu 11.04 and not too knowledgeable about network stuff. Usually people ask things like "wired works but wireless does not!" In my case, I'm just curious what it means to have both seemingly connected. In the ifconfig output in a shell I see that I have 192.168.1.2 for the wireless AND 192.168.1.3 for the Ethernet. What does this mean for applications: does one of the two get precedence? It seems like my pages load slightly faster, so perhaps the Ethernet is being used, but I could be imagining the speedup...
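
    A quick way to check which link actually carries the traffic, as a sketch assuming the iproute2 tools that ship with Ubuntu 11.04: the kernel consults the routing table, and the default route with the lowest metric wins, so the answer lives there rather than in the interface list.

        # Show the routing table; the lowest-metric default route wins
        ip route

        # Ask the kernel which interface it would pick for an outside host
        # (8.8.8.8 is just an example destination)
        ip route get 8.8.8.8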

  • IE8 Stopped Keeping History

    - by BillP3rd
    Like the title says, my IE8 has apparently stopped keeping the history of pages I've visited. I've searched SU and Google and can't find anything that describes what I'm seeing. I have IE set to retain history for 999 days (the maximum allowed), yet apart from today and last Thursday, IE appears to be oblivious to any activity more recent than three weeks ago, and clicking on either "Thursday" or "Today" reveals no recorded history. Very odd behavior. The history does extend back 30 weeks to when I built the computer, and there is recorded history for every week before that. I'd appreciate suggestions. NB: Windows 7 Ultimate, x64 (but 32-bit IE8).

  • List all documents (web parts) and sites using a certain solution in SharePoint 2007

    - by tnolan
    I would like to uninstall a SharePoint application template (GroupBoard Workspace, to be exact), but I want to make sure nothing currently relies on it. I don't see any stsadm operation that will tell me this, and I have even tried SPM, which would work, but with such a huge site it's tedious to go through every single web and page to see which features are in use. Is there a way (probably with SQL, using the id from stsadm -o enumsolutions) to list everything that relies on a template within a given solution, including web parts on custom pages? If this is not possible, what is the best way to check dependencies prior to uninstalling a solution (especially since GBW is not the only one on my axe list)? Note: I know that stsadm -o deletesolution will stop me from removing something that is in use, but I want to see all of the things that are using a given solution.
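
    One hedged starting point, assuming the farm is on WSS 3.0/MOSS 2007 SP2 (which, as I understand it, added reporting flags to stsadm -o enumallwebs): dump the solution and web inventories to XML and match the feature and web-part GUIDs against the ids from enumsolutions.

        stsadm -o enumsolutions > solutions.xml
        stsadm -o enumallwebs -includefeatures -includewebparts > webs.xml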

  • Web Server Setup

    - by gustyaquino
    Hello. In my workplace, we want to run our own web server for at least 100 Apache/PHP/MySQL web pages. My boss is opposed to hiring skilled personnel; he thinks we can do it ourselves. Currently, we are working with a HostGator reseller account. I chose CentOS as the operating system, but I don't know the best hardware solution. HP? Dell? What about the setup on these platforms? Thanks. PS: sorry for my bad English. Edit: The purpose of this migration isn't related to performance issues, but independence.
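
    For the software side, a minimal sketch of the stack on CentOS, assuming the stock repositories (the hardware question is separate; any current HP ProLiant or Dell PowerEdge with enough RAM for MySQL will run this):

        # Install the LAMP stack from the base repositories
        yum install httpd php php-mysql mysql-server

        # Start the services and enable them at boot
        service httpd start && chkconfig httpd on
        service mysqld start && chkconfig mysqld on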

  • Apache deny access to images folder, but still able to display via <img> on site

    - by jeffery_the_wind
    I have an images folder on my site, let's call it /images/, where I keep a lot of images. I don't want anyone to have direct access to the images via the web, so I put a new directive in my Apache config that achieves this:

        <Directory "/var/www/images/">
            Options Includes
            AllowOverride All
            Order allow,deny
            Deny from all
        </Directory>

    This is working, but it blocks ALL access, and I can't show the images through my web pages anymore. I guess this makes sense. So how do I selectively control access to these images? Basically I only want to display certain images through certain web pages and to certain users. What is the best way to do this? Do I need to save the images to the database? Tim
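
    One common middle ground is Referer-based access: block direct requests but allow ones that arrive from your own pages. A minimal sketch in Apache 2.2 syntax, with example.com standing in for your domain; note that Referer headers can be spoofed or stripped, so for genuine per-user control the images are usually served through a script that checks the user's session instead of being linked directly.

        SetEnvIfNoCase Referer "^https?://(www\.)?example\.com/" local_referer
        <Directory "/var/www/images/">
            Order deny,allow
            Deny from all
            Allow from env=local_referer
        </Directory>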

  • Asking for brief explanation of reverse proxying and recommended software

    - by 80skeys
    I need to set up a reverse proxy where the backend web server serves HTTPS pages only. I've never set up a reverse proxy and would like a brief overview of how it works. One question is whether the proxy itself also needs to serve HTTPS, or whether plain HTTP suffices. A second question is whether to use Apache, Varnish, nginx, or Squid. This is for an internal site for a small company, so not a lot of traffic is expected; maybe a few dozen users each day.
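
    To make the HTTPS half concrete, a minimal Apache sketch, assuming mod_proxy and mod_ssl are loaded and internal.example is a placeholder for the backend: the front side can be plain HTTP (though anything sensitive then crosses the wire unencrypted), and SSLProxyEngine is what lets Apache itself speak HTTPS to the backend.

        <VirtualHost *:80>
            ServerName proxy.example.com
            SSLProxyEngine On
            ProxyPass        / https://internal.example/
            ProxyPassReverse / https://internal.example/
        </VirtualHost>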

  • How do I get my 192.168.* Linux server accessible via http://hostname/?

    - by rfrankel
    (Sorry if this question isn't worded well and/or is a duplicate. I'm not a networking guy and I'm probably not using the right terms, which also makes it hard to see if this has already been answered.) I'm running a CentOS server in VirtualBox on a Windows host, and I can access Apache-hosted pages at http://192.168.1.109/ from machines on my LAN. But what I'd like is for people to be able to type http://hostname/ instead, both because it's easier and primarily because I'm not sure that local IP is static. I'm not really sure how to proceed; could someone point me in the right direction? Thanks.
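
    The zero-infrastructure workaround, assuming you control the client machines: give the VM a fixed address (e.g. a DHCP reservation on the router so 192.168.1.109 stays put) and add a line to each Windows client's hosts file. "hostname" below stands for whatever name you want to publish.

        # C:\Windows\System32\drivers\etc\hosts
        192.168.1.109    hostname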

  • What is causing my wireless problems?

    - by user34629
    I have a Dell XPS M1330. Since I moved to my new house, the wireless barely works on it. My housemates report no problems, so I guess it's something to do with my laptop. It's erratic in normal mode, sometimes not working for days, usually working on-and-off, forcing me to reload pages a few times before they work, etc. It works perfectly in safe mode. The driver I'm using is the Intel PRO/Wireless 3945ABG Network Connection, version 12.4.4.5, dated 26/10/2009, the most recent one I could find. Bluetooth is not enabled, and I'm running Vista 32-bit Home Premium, which the laptop shipped with. I also tried disabling IPv6 and resetting the IP stack, both to no avail. Can anybody help me?

  • How to protect a PHP app (vBulletin) from hackers

    - by samsmith
    Our vBulletin system is under constant attack, raising CPU load and making the system very slow for legitimate users. It is a scripted attack that attempts to log in and/or create new login IDs (mostly it is trying to create login IDs in order to spam the site). In vBulletin, we have blacklisted large ranges of IPs, which has helped a lot, but the attacks continue. Is there an automated way to protect the application or web server? Ideally, the protection would detect the pages accessed and automatically blacklist the IP.
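
    One automated option, assuming a Linux host with fail2ban installed: point a jail at the Apache access log and ban IPs that hammer the login/registration URLs. The paths and the filter regex below are a hypothetical sketch and would need tuning to your log format and vBulletin URLs.

        # /etc/fail2ban/jail.local
        [vbulletin-abuse]
        enabled  = true
        port     = http,https
        filter   = vbulletin-abuse
        logpath  = /var/log/httpd/access_log
        maxretry = 5
        bantime  = 86400

        # /etc/fail2ban/filter.d/vbulletin-abuse.conf
        [Definition]
        failregex = ^<HOST> .* "(GET|POST) /(login|register)\.php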

  • Method to control multiple sites using same cookie?

    - by Frost Shadow
    Is it possible for two different web pages to use the same cookie? For example, some news sites now have buttons at the bottom where, if you are logged into Facebook, you can just click the button to "like" the article. Is this a case of a third-party website using Facebook's cookie to know which account you are, and if so, is there a way I can control it? I'm not sure how the new "like" system works; maybe the button part isn't actually on the news site but is hosted on Facebook's servers or something, so it's really Facebook itself accessing its own cookie. If that's the case, is there a way I can choose when a site accesses its own cookie? Thanks for any help!

  • Recommendations for SSD for server and database use?

    - by Tony_Henrich
    SSDs are a new technology and they are constantly improving. A lot of the posts here were written in 2009, when SSDs were less mature and not as fast; what was recommended back then is probably out of date today because of better options. The SSD will hold SQL Server databases, probably around 128 GB of them. The database backs a CMS and web server, so web pages need to get their data and render as fast as possible. Which modern SSD is recommended for such a use? Is there an SSD better than the Intel X25-E/M in terms of performance per cost? (I am also comparing the cost of RAM + UPS (semi-persistent) vs. an SSD of the same capacity. No RAID is involved.)

  • Spare PC with XP to be used as Torrent Downloader and local Web Server HOWTO?

    - by gslide
    Hi, I'm in a bit of a pickle trying to set up my old laptop, running Windows XP, to serve as two devices in one: a torrent downloader and a local web server. How do I do this? I have a wireless NIC and a wired LAN port, and I have two internet connections. I would like to download torrents only over the wired LAN and run the web server on the wireless connection, with the web server also accessible from the internet. The reason for trying to separate the connections is that I can't have torrent downloads using all my bandwidth, since then my web pages can't be accessed: they time out or load too slowly. Given the two broadband connections, is this even possible, or would I need a different OS or program?
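
    It is possible in principle, because listening services can be bound to a single NIC's address, and most torrent clients likewise have a "bind to IP/interface" option in their connection settings. A hedged sketch for the web-server half, assuming Apache on the XP box and that 192.168.0.10 (a placeholder) is the wireless NIC's address:

        # httpd.conf: listen only on the wireless NIC's address
        Listen 192.168.0.10:80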

  • I do not understand -printf script

    - by jerzdevs
    I have taken over the responsibility of RHEL 5 scripting and I've not had any training in this platform or Bash scripting. There's a script that has multiple pieces to it; I will ask only about the second piece, but I'll also show you the first, as I think it will help with my question below. The first part of the script lists the users on a particular server:

        cut -d : -f 1 /etc/passwd

    The output will look something like: root, bin, joe, rob, other... The second script requires me to fill in each of the accounts listed by the above script and run it. From what I can gather from the man pages and other web searches, it goes out and finds the group owner of a file or directory and then sorts and picks out just the unique records, but I'm not really sure. So that's my question: what does the script below really do? (The funny thing is that if I plug in a name from the output above, I'll sometimes receive a "cannot find username blah, blah, blah" message.)

        find username -printf %G | sort | uniq
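
    For reference, a cleaned-up sketch of that second script, assuming "username" is meant to be a path such as the account's home directory: find walks the tree and -printf %G prints each file's numeric group ID (with no \n the IDs all run together on one line, which is why the sort | uniq accomplishes so little), and the "cannot find" errors appear whenever no file or directory by that name exists under the current directory.

        # Unique numeric group IDs owning files under a user's home directory
        find /home/username -printf '%G\n' | sort -u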

  • Lighttpd based server issues crop up when port forwarding

    - by michael
    I have four host computers running lighttpd web servers. They sit behind an HSPA modem, each occupying an HTTP port between 81 and 84 (80 is taken by the modem itself). The port forwarding is set up correctly; however, only a portion of any web page I request from any of the hosts comes through (they all fail after about 20% of the page). If I put the host on port 81 into the DMZ, it serves pages fine. The others do not respond to the DMZ treatment. Is it possible the web content on the hosts somehow requires ports other than their respective HTTP port? Or is it possible that even though server.port is set in the lighttpd_ssl.conf file, the individual hosts are still expecting to serve on port 80? I am not familiar with lighttpd, nor did I set these up; they are running on video encoders I purchased. I can grab any files from them required for further information on the problem.
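
    A transfer that consistently dies partway through behind a 3G/HSPA modem is a classic path-MTU symptom rather than a lighttpd one. A hedged first test, assuming there is a Linux box with iptables forwarding the traffic, is to clamp the TCP MSS and see whether full pages arrive:

        iptables -t mangle -A FORWARD -p tcp --tcp-flags SYN,RST SYN \
                 -j TCPMSS --clamp-mss-to-pmtu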

  • Run script when Varnish starts

    - by kipusoep
    I'd like to run a script when Varnish starts. This script should make a web request to a web server (Varnish's backend), which then makes sure Varnish's cache gets filled with all pages residing on that web server. So this script ensures everything is in Varnish's cache whenever Varnish (re)starts, because we're using Varnish as a cache and for fail-over (the web server should be able to be down for, let's say, a week without any consequences). What are the possibilities here? We can't just edit /etc/init.d/varnish and /usr/sbin/varnishd, because they can get overwritten when updating Varnish. Thanks!
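
    A minimal warm-up sketch, assuming the site is crawlable from its root (or that you have a sitemap to feed the crawler): rather than editing /etc/init.d/varnish, which a package update can overwrite, call something like this from a separate init script or a cron @reboot entry ordered after varnish.

        #!/bin/sh
        # warm-varnish.sh: crawl the site through Varnish so the cache fills
        wget --recursive --no-directories --delete-after --quiet http://localhost/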

  • Show internal website through iframe

    - by tommasop
    Hi guys, I have a public website with an iframe pointing to a private website (only visible inside my company's LAN). I'd like the iframed pages to be visible from the outside as well. Is this possible to achieve? My public server is Windows Server 2003 with IIS 6, and the server itself can browse to the private server's web pages without trouble. My private server is an Ubuntu 8.04 machine. I tried an IIS virtual directory redirection, but it's not working.

  • Mixed content warning on ASPX page

    - by Amit
    Hi, we have started receiving mixed content warnings on ASPX pages on our secured site. We do not have any mixed content; we load all our JS, images, CSS, and ASPX files over HTTPS. I don't know why we have started receiving these warnings now. The latest thing we added is a third-party dialog-box control from Essential Objects; we were already using their menu control but added the dialog box recently. We have also made our application browser-compatible. I feel the cause lies somewhere between these two changes. Can anyone suggest a solution or workaround, especially if you have used Essential Objects controls and faced a similar issue? Essential Objects says it is not their problem. The mixed content warning appears at arbitrary times, not specifically when the Essential Objects dialog box pops up, which is why I am a bit confused. Any help is highly appreciated. Thanks.
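
    A quick way to hunt for the trigger, assuming you can save the rendered source of an affected page: the warning fires on any http:// sub-resource, and third-party controls often emit their own script and image URLs, so search the rendered output rather than your own markup.

        rem List every line of the saved page that still references plain http://
        findstr /i /n "http://" page.html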

  • Does searching a keyword on Google make the crawlers look harder in the future?

    - by Foo Bar
    Do the search requests made by users influence how strongly the Google crawlers are attracted to a keyword? Let's say Google has some hits for a specific keyword in its search index, and now I search for exactly that keyword. Will the Google crawlers react to the search and look harder for pages that could match this keyword? A reason why this could be important: privacy when searching for yourself. Assume you just want to know how much Google (and thus other people) can find out about you. If any additional search for your name statistically triggered the crawlers to dig even one step harder, it would have the perverse effect of making you easier to find in the future, even though your intention was to find out how little Google knows about you. It's a bit like the dilemma in quantum mechanics: does observing the system automatically change the system?

  • Our company has hundreds of thousands of photos; how do we store and browse/find them efficiently?

    - by tobefound
    We currently store our photos in a structure like this:

        folder\1\10000 - 19999.JPG|ORF|TIF  (10,000 files)
        folder\2\20000 - 29999.JPG|ORF|TIF  (10,000 files)
        etc...

    They are stored on four different 2 TB D-Link NASes, attached and shared on our office network (\\nas1, \\nas2, and so on). Problems: 1) When a client (Windows only, Vista and 7) browses, say, the \\nas1\folder\1\ folder, performance is quite poor: the list takes a long time to generate in the Explorer window, even with icons turned off. 2) Initial access to the NAS itself is sometimes slow. SAN disks are too expensive for us, even with iSCSI interface/switch technology. I've read a lot of tech pages saying that storing 100,000+ files in one single folder shouldn't be a problem, but we don't dare go there now that we experience problems at the 10K level. All input greatly appreciated, /T

  • Apply rewrite rule to all but the files (recursive) in a subdirectory?

    - by user784637
    I have an .htaccess file in the root of the website that looks like this:

        RewriteRule ^some-blog-post-title/ http://website/read/flowers/a-new-title-for-this-post/ [R=301,L]
        RewriteRule ^some-blog-post-title2/ http://website/read/flowers/a-new-title-for-this-post2/ [R=301,L]

        <IfModule mod_rewrite.c>
            RewriteEngine On

            ## Redirect all pages except for files in wp-content to website/read
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteCond %{REQUEST_FILENAME} !-d
            RewriteCond %{REQUEST_URI} !/wp-content
            RewriteRule ^(.*)$ http://website/read/$1 [L,QSA]
            #RewriteRule ^http://website/read [R=301,L]

            RewriteBase /
            RewriteRule ^index\.php$ - [L]
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteCond %{REQUEST_FILENAME} !-d
            RewriteRule . /index.php [L]
        </IfModule>

    My intent is to redirect people to the new blog post location if they request one of those specific blog posts. If that's not the case, they should be redirected to http://website.com/read. Nothing from http://website.com/wp-content/* should be redirected. So far conditions 1 and 3 are being met. How can I meet condition 2?
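
    A hedged restructuring that should satisfy all three conditions, assuming the stock WordPress block stays last: handle the exclusions in order, with the special posts first, wp-content passed through untouched, and everything else that isn't a real file or directory redirected with an explicit R=301.

        RewriteEngine On
        RewriteBase /

        # 1) Special posts keep their dedicated 301s
        RewriteRule ^some-blog-post-title/ http://website/read/flowers/a-new-title-for-this-post/ [R=301,L]

        # 2) Never touch wp-content
        RewriteRule ^wp-content/ - [L]

        # 3) Everything else that isn't a real file or directory goes to /read
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ http://website/read/$1 [R=301,L,QSA]

        # (the stock WordPress index.php block can follow here unchanged)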

  • Setting correct Content-Type sent from WordPress, on Apache server

    - by eoinoc
    I need help pointing me in the right direction for fixing the Content-Type returned by Apache for content produced by WordPress; I'm having trouble figuring out why WordPress is returning incorrect headers. Issue: the specific problem is that our WordPress blog pages are downloaded as a file rather than displayed by Internet Explorer and Chrome v21. The server returns Content-Type: application/x-gzip, whereas I'm told I should expect Content-Type: text/html. Background: the URL is http://www.bitesizeirishgaelic.com/blog/.
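
    A first check before blaming WordPress or Apache, using curl to see the headers actually sent: a Content-Type of application/x-gzip on an HTML page usually means the body is compressed twice (for example, gzip output enabled in PHP/WordPress plus mod_deflate in Apache), and disabling one of the two layers is the usual fix.

        curl -sI http://www.bitesizeirishgaelic.com/blog/ | grep -i '^content-'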
