Search Results

Search found 33834 results on 1354 pages for 'site column'.


  • HTG Explains: What Is RSS and How Can I Benefit From Using It?

    - by Jason Fitzpatrick
    If you’re trying to keep up with news and content on multiple web sites, you’re faced with the never-ending task of visiting those sites to check for new content. Read on to learn about RSS and how it can deliver the content right to your digital doorstep. In many ways, content on the internet is beautifully linked together and accessible, but despite the interconnectivity of it all we still frequently find ourselves visiting this site, then that site, then another site, all in an effort to check for updates and get the content we want. That’s not particularly efficient, and there’s a much better way to go about it. Imagine, if you will, a simple hypothetical situation. You’re a fan of a web comic, a few tech sites, an infrequently updated but excellent blog about an obscure music genre you’re a fan of, and you like to keep an eye on announcements from your favorite video game vendor. If you rely on manually visiting all those sites—and, let’s be honest, our hypothetical example has a scant half-dozen sites while the average person would have many, many more—then you’re either going to waste a lot of time checking the sites every day for new content, or you’re going to miss content as you forget to visit the sites or find it after it’s no longer useful or relevant to you. RSS can break you free from that cycle of over-checking and under-finding by delivering the content to you as it is published. Let’s take a look at what RSS is and how it can help.
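
    For the curious, an RSS feed is just an XML file the site republishes as it posts new content, which a reader app polls for new items. A minimal, hand-written illustrative feed (example.com and the item are invented) looks something like this:

        <?xml version="1.0" encoding="UTF-8"?>
        <rss version="2.0">
          <channel>
            <title>Example Web Comic</title>
            <link>http://example.com/</link>
            <description>Illustrative feed; a reader polls this file for new items.</description>
            <item>
              <title>Episode 42</title>
              <link>http://example.com/42</link>
              <pubDate>Mon, 06 Sep 2010 08:00:00 GMT</pubDate>
            </item>
          </channel>
        </rss>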

    Read the article

  • What are the requirements to test a website using jquery.get()? [migrated]

    - by Frankie
    I am working on a simple website. It has to search quite a few text files in different sub-folders. The rest of the page uses jQuery, so I would like to use it for this as well. The function I am looking at is .get(), for downloading the files. So my main question is: can I test this on my local computer (Ubuntu Linux), or do I have to upload it to a server? Also, if there's a better way to go about this, that would be nice to know. However, I'm more worried about getting it working. Thanks, Frankie

    PS: Here's the JS/jQuery code for downloading the files into an array:

        g_lists = new Array();
        $(":checkbox").each(function(i){
            if ($(this).attr("name") != "0") {
                var path = "../" + $(this).attr("name") + ".txt";
                $("#bot").append("<br />" + path); // debug
                $.get(path, function(data){
                    g_lists[i] = data;
                    $("#bot").html(data);
                });
            } else {
                g_lists[i] = "";
            }
        });

    Edit: Just a note about the path variable. I think it's correct, but I'm not 100% sure; I'm new to web development. Here are some examples it produces, and the directory tree of the site. Maybe it will help; it can't hurt.

        .
        +-- include
        |   +-- jquery.js
        |   +-- load.js
        +-- index.xhtml
        +-- style.css
        +-- txt
            +-- Scripting_Tools
                +-- Editors.txt
                +-- Other.txt

    Examples of path:

        ../txt/Scripting_Tools/Editors.txt
        ../txt/Scripting_Tools/Other.txt

    Well, I'm a new user, so I can't "answer" my own question, so I'll just post it here: after asking for help on an IRC channel specific to jQuery, I was told I could use this on a local host. To do this I installed the Apache web server and copied my site into its directory. More information on setting it up can be found here: http://www.howtoforge.com/ubuntu_debian_lamp_server Then to run the site I navigated my browser to "localhost" and everything works.
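
    If all you need is to serve the files over HTTP locally, a lighter-weight alternative to a full LAMP stack (a general suggestion, not from the original thread) is Python's built-in web server, which ships with Ubuntu:

        # Run from the site's root directory; serves everything below it
        # on http://localhost:8000/ so the $.get() calls are same-origin.
        python -m SimpleHTTPServer 8000    # Python 2
        python3 -m http.server 8000        # Python 3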

    Read the article

  • ERROR: 6553609 You are not authorized to perform the current operation

    - by Tim
    I'm trying to back up a sub-site in my portal using Smigrate. When logged into the server I run the following command and get this output:

        D:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\60\BIN\smigrate -w -f C:\CH_Test_04_backup.fwp -u domain\user -pw **

        20 Nov 2009 13:57:06 -0000 Site collection opened
        20 Nov 2009 13:57:07 -0000 Authenticating against the web server.
        20 Nov 2009 13:57:07 -0000 Already authenticated against the web server.
        20 Nov 2009 13:57:07 -0000 ERROR (possibly fatal): ERROR: 6553609 You are not authorized to perform the current operation.
        20 Nov 2009 13:57:07 -0000 Site collection closed

    I get an error (ERROR: 6553609 You are not authorized to perform the current operation) and the site does not get backed up. I've tried the solutions outlined in various KB articles but had no luck with them. Can anyone help? Regards, Tim

    Read the article

  • What Should I Do? [closed]

    - by Laxmidi
    What is a reasonable goal in terms of traffic for my Flex 3 site, www.brainpinata.com? Since I began a couple of months ago, I've gotten roughly 5500 ad views and 280 ad clicks, and the ad revenue is a whopping $4.80. (I don't use Google AdSense.) I advertise my site using Google AdWords to try to build traffic; my budget is $10/day. What should I do?

    a) Push the marketing. Add a blog, try to get backlinks, contact blogs, start a Facebook page, tweet, etc.

    b) Google is only indexing the static content in the SWF. The questions/answers are pulled from a MySQL database, so Google doesn't index 99% of the content. Should I re-do the site in HTML/JavaScript and hard-code the questions for each puzzle? (This would be a challenge, as I don't know JavaScript worth squat.) Or should I hard-code the questions in XML and put them in the Flex app? If I put the questions in an XML file it's roughly 500 KB. Other ideas?

    c) Should I switch ad networks? (I currently get about 100 visitors a day.) My ad network pays so little that to make even $500/month I would need 550,000 ad views/month, which seems impossible. If I do switch ad networks, I need one that allows iframes, as I've got a Flex website. Which ad networks permit their ads to be shown in iframes?

    d) Should I cut and run? I put a lot of work into this project and it would really stink to get nothing out of it.

    I'm looking for some good advice. Looking forward to your suggestions. Thank you. -Laxmidi

    Read the article

  • First project is a big one: how much should we charge?

    - by confuzzled
    Two of my cousins and I started a freelance computer repair/web design business to make some money on the side during college, and we received our first major web design project about three weeks ago. We've created websites before, but mostly for family businesses, and we have never really charged money; most of those sites were static and didn't require a CMS. This project, however, was a big one (for us, anyway). We created a news site with several categories; we created the banners; we created a classifieds page (not a web app, just something static that they control). Several links, a few graphical assets, a CSS drop-down menu, an RSS feed from a different news site, weather, all the normal stuff you would find on a regular news site. On top of that we put in all the usual Joomla pieces (search, JComments, JSlide pictures, JCE, etc.). Then we uploaded the first 10 articles they gave us, and we are going to train them in how to use Joomla. At first we decided on 700 dollars: I assumed they just wanted a simple blog-like website where they could upload articles, but then we had a meeting and they asked for a lot more. Note: we did not hard-code the template from scratch, but customized the Gantry framework to fit their needs. We did code quite a bit, however. I estimate that we put in about 50-60 hours in total. I'm wondering if 700 dollars is a bit low; this price is definitely not set in stone. Please keep in mind that this is our first project and we are newbies. Please be kind. Thank you!

    Read the article

  • 12.10: How to hide bind mounts in Nautilus?

    - by Bazon
    Summary: How do I keep folders mounted via bind or bindfs in /etc/fstab from appearing as devices in the Nautilus left column, the "Places" view?

    Details: Hello, I mount various directories from my data partition into my home directory via bind entries in /etc/fstab, e.g. like this:

        # using bind:
        /mnt/sda5/bazon/Musik /home/Bazon/Musik none bind,user 0 0
        # or using bindfs:
        bindfs#/mnt/sda5/tobi/Downloads /home/tobi/Downloads fuse user 0 0

    (Background: /dev/sda5, mounted at /mnt/sda5, is my old home partition, but I do not want to mount it as a home partition, as I always have at least two Linuxes on the computer.) That works well, but since 12.10 every one of those items is listed in Nautilus in the left column under "Devices" (where USB drives, etc. normally appear). This wastes space, as I have many such mounts, so I would like to have these mounts hidden, just as they were in 12.04. How can I do that? Thanks!
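
    One possible approach, assuming a GVFS version new enough to honor the x-gvfs-hide mount option (that availability on 12.10 is a guess, not a confirmed fix): adding the option to the fstab entry asks the GVFS volume monitor not to present the mount as a device:

        # Hypothetical variant of the entry above; x-gvfs-hide tells GVFS
        # to hide this mount in file managers such as Nautilus.
        /mnt/sda5/bazon/Musik /home/Bazon/Musik none bind,user,x-gvfs-hide 0 0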

    Read the article

  • Set up IIS 7 as an FTP server that is reachable from outside my local network

    - by Usta
    I was able to set up an FTP site that I can access via ftp://127.0.0.1/ or my local (static) IP. To do this I followed these instructions (except that I did not bind to 127.0.0.1 as suggested): http://learn.iis.net/page.aspx/301/creating-a-new-ftp-site-in-iis-7/ I have created firewall exceptions for ports 20 and 21, and set up port forwarding on my wireless router. But I can only access the site via localhost, and I need a friend to have read access to it. So how do I enable remote access to it? (I'd rather not purchase a domain name.) My setup:

        IIS 7.5
        Windows 7 Professional
        Wireless network
        Norton Internet Security 2012
        An internal static IP address
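
    Two things that commonly block external FTP access, sketched here as general suggestions rather than a confirmed diagnosis (the port numbers are illustrative): passive-mode data connections use a separate port range that also needs forwarding on the router, and the firewall has to allow that range:

        rem Constrain IIS 7.5's passive data-channel port range so it can be forwarded:
        %windir%\system32\inetsrv\appcmd set config /section:system.ftpServer/firewallSupport /lowDataChannelPort:5000 /highDataChannelPort:5100

        rem Allow the same range inbound (Norton would need an equivalent rule):
        netsh advfirewall firewall add rule name="FTP passive range" dir=in action=allow protocol=TCP localport=5000-5100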

    Read the article

  • Need to Determine the Engine Status?

    - by user702295
    If you need to establish the status of the engine, begin with this SQL:

        select status, engine, engine_version, fore_column_name
        from dm.forecast_history

    The status of an engine run is stored in the FORECAST_HISTORY table, in the "status" field. That table also contains the FORE_COLUMN_NAME field, which holds the name of the column in SALES_DATA in which the relevant forecast is stored. Here are the possible statuses:

        -1, -2: The engine failed in the initialization phase, i.e. before the engine manager created the engines.
        0: The engine stopped in the optimization phase, i.e. after the engines were created.
        1: The engine finished the run successfully.
        2: A forecast was never calculated for the relevant column mentioned in FORE_COLUMN_NAME.
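
    As an illustration, a query along these lines (built only from the fields described above) would surface any runs that did not finish successfully, together with the SALES_DATA column each one would have written to:

        -- Hypothetical troubleshooting query: list non-successful engine runs.
        select engine, engine_version, status, fore_column_name
        from dm.forecast_history
        where status <> 1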

    Read the article

  • Can I use a 302 redirect to serve up static content from a URL with _escaped_fragment_?

    - by Starfs
    We would like to serve up SEO-friendly AJAX-driven content, following this documentation. Has anyone ever tried to write a 302 redirect into the .htaccess file that takes the '?_escaped_fragment_=' string and sends it to a static page, for example /snapshot/yourfilename/? How will Google react to this? I've gone through the documentation and it's not very clear. The quote below is what I find in Google's documentation; I'm not sure whether they are saying you can redirect the _escaped_fragment_ URL to a different static page, or that this is for redirecting the hashtag URL to static content. Thoughts? From Google's site: Question: Can I use redirects to point the crawler at my static content? Redirects are okay to use, as long as they eventually get you to a page that's equivalent to what the user would see on the #! version of the page. This may be more convenient for some webmasters than serving up the content directly. If you choose this approach, please keep the following in mind: Compared to serving the content directly, using redirects will result in extra traffic because the crawler has to follow redirects to get the content. This will result in a somewhat higher number of fetches/second in crawl activity. Note that if you use a permanent (301) redirect, the url shown in our search results will typically be the target of the redirect, whereas if a temporary (302) redirect is used, we'll typically show the #! url in search results. Depending on how your site is set up, showing #! may produce a better user experience, because the user will be taken straight into the AJAX experience from the Google search results page. Clicking on a static page will take them to the static content, and they may experience avoidable extra page load time if the site later wants to switch them to the AJAX experience.
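
    For what it's worth, a minimal mod_rewrite sketch of the idea being asked about (the /snapshot/ path is the asker's own hypothetical naming, and the rule assumes the AJAX app lives at the site root):

        # If the request carries an _escaped_fragment_ query string,
        # 302-redirect the crawler to the corresponding static snapshot.
        RewriteEngine On
        RewriteCond %{QUERY_STRING} ^_escaped_fragment_=(.*)$
        RewriteRule ^$ /snapshot/%1? [R=302,L]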

    Read the article

  • How can I move mysites to a new location

    - by Bob
    I recently restored my content and was instructed to create My Sites in a different location than was originally used. Now I have several users' My Sites in /personal; the new desired location is /mysites. From what I found in the documentation, I should back them up and restore them to the new location. Here's what I've done.

    Back up an individual user's My Site site collection:

        stsadm -o backup -url "https://myUrl/personal/john_smith" -filename johnsmith.bkup

    Restore it to the new location:

        stsadm -o restore -url "https://myUrl/mysites/john_smith" -filename johnsmith.bkup -overwrite

    The result, and the problem, is that when I enumerate sites I end up with this:

        <Site Url="https://myUrl/mysites" Owner="domainname\john.smith"
              ContentDatabase="WSS_Content_MySites" StorageUsedMB="1.6"
              StorageWarningMB="90000" StorageMaxMB="100000" />

    It leaves off the username part of the URL, and if I restore more than one they want to overwrite each other.
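
    One thing worth checking, offered as a guess rather than a confirmed fix: this symptom matches a missing wildcard managed path for /mysites, without which restores land at the root of the path instead of under each username. Something along these lines would define one before re-running the restore:

        rem Hypothetical command: add a wildcard managed path so site collections
        rem can live *under* /mysites (e.g. /mysites/john_smith).
        stsadm -o addpath -url https://myUrl/mysites -type wildcardinclusion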

    Read the article

  • Best Architecture for ASP.NET WebForms Application

    - by stack man
    I have written an ASP.NET WebForms portal for a client. The project has evolved rather than being properly planned and structured from the beginning; consequently, all the code is mashed together within the same project, without any layers. The client is now happy with the functionality, so I would like to refactor the code such that I will be confident about releasing the project. As there seem to be many differing ways to design the architecture, I would like some opinions about the best approach to take.

    FUNCTIONALITY: The portal allows administrators to configure HTML templates. Other associated "partners" can display these templates by adding IFrame code to their sites. Within these templates, customers can register and purchase products. An API has been implemented using WCF, allowing external companies to interface with the system as well. An Admin section allows administrators to configure various functionality and view reports for each partner. The system sends out invoices and email notifications to customers.

    CURRENT ARCHITECTURE: It currently uses EF4 to read/write to the database, and the EF objects are used directly within the aspx files. This facilitated rapid development while I was writing the site, but it is probably unacceptable to keep it that way, as it tightly couples the DB with the UI. Specific business logic has been added to partial classes of the EF objects.

    QUESTIONS: The goal of refactoring is to make the site scalable, easily maintainable and secure. 1) What kind of architecture would be best for this? Please describe what should be in each layer, whether I should use DTOs / POCOs / the Active Record pattern, etc. 2) Is there a robust way to auto-generate DTOs / BOs so that any future enhancements will be simple to implement despite the extra layers? 3) Would it be beneficial to convert the project from WebForms to MVC?
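
    As a rough illustration of one common shape for such a refactor (all names here are invented for the example, not taken from the project), a repository interface plus a service layer keeps EF4 out of the pages:

        // Hypothetical sketch of a layered design for the portal.
        public class Order { public int Id; public decimal Total; }

        // Data-access layer: the only place that knows about EF4.
        public interface IOrderRepository
        {
            Order GetById(int id);
            void Add(Order order);
        }

        // Business layer: invoicing/notification logic lives here,
        // not in .aspx code-behind files.
        public class OrderService
        {
            private readonly IOrderRepository _orders;
            public OrderService(IOrderRepository orders) { _orders = orders; }

            public void PlaceOrder(Order order)
            {
                // validate, persist, then trigger invoice/email sending
                _orders.Add(order);
            }
        }

        // A WebForms page (or an MVC controller later) depends only on OrderService.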

    Read the article

  • Possibly hacked Delicious account suddenly containing links to adult sites, etc. Is there an alternative to Delicious?

    - by Homunculus Reticulli
    I was searching for some academic publications I saved earlier on Delicious. To my astonishment, there are a lot of tags and links in my account that point to adult material from a Brazilian site (in Portuguese or Spanish), and lots of links to European sports sites (in German or Dutch). Most of these links are public, and it seems my account has been hacked. I searched the Delicious site to see if I could report the issue, and of course there are no contact details. I've had just about enough of Delicious (first they changed their login system after being taken over, resulting in me losing over a year's worth of bookmarks; now this). Is there a replacement online bookmarking service someone can recommend?

    Read the article

  • How do I keep controversy in check?

    - by Aaron Digulla
    This is probably off-topic, but it's less off-topic here than on any other SO site, so please bear with me. I'm working on a new project, votEm. The goal is to give independent candidates a platform to introduce themselves to get elected to a political office. My main reason is that today it's too expensive to run for office: some politicians in the US spend as much as 30 million dollars (!) on a single campaign. That money is better spent elsewhere. In a similar fashion, people who want to change countries like Egypt could use such a platform to present themselves. Now, I expect a lot of emotions and pressure on my site. People with a lot of money (and a lot to lose) will try to game it (political parties, secret services of ... errr ... "not 100% democratic countries", big companies, ...). To avoid as many mistakes as possible, I need a list of resources, ideas and tips on how to keep such a site out of too much trouble. PS: I'd make this CW but the option seems to be gone...

    Read the article

  • Is SCCM overkill for medium-sized organizations?

    - by Le_Quack
    I am an IT technician in a high school with around 1600 students, 250 staff and 800+ client computers, mostly running Windows 7. Our team is composed of three members. My boss seems content with a network that (just about) works, not necessarily a productive, well-maintained network that is easy to run and maintain. I'm still fairly early in my IT career, so I'm not up to speed on all the different endpoint management solutions that are available. I'm looking for a better way to manage clients (deploy software, track changes, inventory, etc.). I like the look of SCCM 2012's features, but the case studies seem to be aimed at large multi-site infrastructures rather than a single mid-sized site. Is SCCM suitable for a mid-sized single site, or is it aimed at much larger corporations? How can I determine whether an endpoint management solution like SCCM is a good fit for our organization? EDIT: Thanks for all the help. I'll take a look at SCE and SCCM and get some proposals drawn up to take to my boss/deputy head.

    Read the article

  • Website loading until initial script finishes

    - by wardy277
    Hi, I have a heavily used server (running Plesk). I have some long scripts that take a while to process (huge MySQL database). I have found that in one browser, while such a script is loading, I cannot view any other parts of the site until it finishes. It seems that all the requests go off, but they don't get served until the initial script finishes. I thought this might be a server-wide issue, but it is not: from another computer I can view the site fine, and even on the same computer with a different browser I can navigate fine while the script still loads. I think it must limit the number of requests per session. Is this correct? Is there any way to configure this to allow 2-3 other requests per session? It is really bad that when I am on the phone to a client, having just run a long report, I cannot use the site or follow what they are saying until the page has loaded. Chris
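
    If the scripts are PHP (a fair assumption on a Plesk host), this is classic session locking rather than a request limit: the long report holds the session file locked, and every other request from the same browser session waits for it. A minimal sketch of the usual fix ($_SESSION['user'] is an invented example key):

        <?php
        // In the long-running report script: read what you need from the
        // session, then release the lock so other requests can proceed.
        session_start();
        $user = isset($_SESSION['user']) ? $_SESSION['user'] : null;
        session_write_close();   // releases the session lock immediately

        // ... long-running MySQL report work continues here ...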

    Read the article

  • WordPress blog penalized by Google search - what's wrong?

    - by pawelbrodzinski
    I have a blog (http://blog.brodzinski.com), which is a wordpress.org blog with the pretty popular Thesis theme and almost no other customizations. Some time ago it was penalized by Google search: it simply stopped appearing in search results, even for terms where it used to be the top result, like my name, Pawel Brodzinski, which isn't anything close to a popular search term. To be exact, the site was penalized on Nov 18. It started popping up in search results on Dec 23, but only for a few days; since Dec 27 it is out again. I know Google's guidelines, and I'm not aware of breaking any of them. I submitted a reconsideration request after I noticed the penalty. It was processed, and there was no change whatsoever (no surprise, as it seems the site was penalized again). I checked the diagnostics in Webmaster Tools: no malware was detected and no strange search terms popped up. I read related threads on the Google Webmasters forum but found none of the solutions working for me. I posted a thread there (http://www.google.com/support/forum/p/Webmasters/thread?tid=546339f49d4a03bc&hl=en) and the only answer I got was to check for duplicate content. Well, there is some duplicate content published on the web, but that is true for the vast majority of blogs, and it doesn't seem to be a reason for a penalty. Also, before Dec 27 I was able to remove duplicate content from a couple of sites which were republishing my feed, but this didn't change the situation; the site was penalized again. The problem is I have no idea what is wrong with the website or how to find out. To make the problem worse, I'm no webmaster; I just run a WordPress blog, which is supposed to be easy.

    Read the article

  • Joomla Secondary Users

    - by Gaz_Edge
    Background: I have a Joomla-based application. My customers sign up and register as users on the site. My customers (primary customers) then have their own space on the site in which they can set up their own customers (secondary customers).

    Question/Problem: I need to tag each secondary customer to a primary customer. I thought about creating a new component and keeping all the secondary customers in a separate table, but then I lose all the authentication, session handling and login/logout that the core Joomla _users component offers. I then thought about keeping all the users in the core _users table and recording the primary customer associated with each secondary customer in a field of a profile plugin. This would work for the most part, but it means that primary customers cannot create a secondary customer with a username that already exists in the _users table. I didn't think this would be an issue, but several of my primary customers (currently only test users) have been confused by the site telling them a username is not available, since they can only see the names of their own secondary customers. Any ideas on some architectural changes I could make to solve this?
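
    For concreteness, a sketch of the mapping the profile-plugin idea implies, written as a separate link table instead (all names are invented; #__ stands for the Joomla table prefix). Note this keeps core authentication intact but does not, by itself, address the global-username-uniqueness complaint:

        -- Hypothetical link table tying each secondary customer to the
        -- primary customer who created them, without touching #__users.
        CREATE TABLE `#__customer_links` (
          `secondary_user_id` INT NOT NULL,
          `primary_user_id`   INT NOT NULL,
          PRIMARY KEY (`secondary_user_id`),
          KEY `idx_primary` (`primary_user_id`)
        );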

    Read the article

  • BGP router recommendations for simple redundancy [closed]

    - by Jona
    We have two sites, each with its own internet connection, and a dedicated dark fibre between them. Each site has its own IP space, and we have an AS number. We're looking to be resilient to the failure of the internet connection at either site, and so need to buy a pair of appropriate routers. Requirements are:

        Able to run 2 BGP sessions (one with the ISP, one with the router at the other site)
        The option to take a full table from the upstream ISPs would be nice
        Able to provide HA gateways on the LAN side (e.g. 192.168.0.254 automatically migrates if its host router loses power)
        A dedicated device rather than a server running Linux / BSD
        Not crazy expensive

    Any help / advice much appreciated.

    Read the article

  • Apache htaccess with mod_expires Not Working for certain directories

    - by keyboarddrummer
    I have a Joomla site on which I am trying to enable caching using mod_expires. I have a .htaccess in the root of the site and have added the options found on the page http://www.pactsoftware.nl/tools/joomla-optimization.html Prior to adding these directives, my site scored 55 in the PageSpeed extension in Chrome (caching was at the top of the list, with a lot of images, CSS and JS files). After the directives, it scores 70, with caching in the yellow but still listing some image files (some two directories deep, the rest four). I checked for other .htaccess files in the Joomla root, but none exist between those folders and the root. It is almost as if the .htaccess only works in that one directory and not its subfolders. I have tried putting a .htaccess in each affected subdirectory, but it does not work. Does anyone have any ideas?
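
    For reference, a minimal block of the kind such Joomla optimization pages suggest (the values here are illustrative, not the exact ones from that link). Rules in a root .htaccess normally do apply to subdirectories, so if deeper images stay uncached, the usual suspects are an AllowOverride restriction on those paths or the files being served from a different vhost/alias:

        # Illustrative mod_expires rules; the module must be enabled.
        <IfModule mod_expires.c>
            ExpiresActive On
            ExpiresByType image/png "access plus 1 month"
            ExpiresByType image/jpeg "access plus 1 month"
            ExpiresByType text/css "access plus 1 week"
            ExpiresByType application/javascript "access plus 1 week"
        </IfModule>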

    Read the article

  • No Obvious Answer - Query Strings and JavaScript

    - by nchaud
    Say I have a main page, /my-site/all-my-bath-soaps, which lists all my products. It has a search-filter text box that uses JavaScript to filter the products shown on the page (the URL doesn't change as they filter). From many other parts of the site I want to navigate to this products page and see specific products. E.g. <a href="/my-site/all-my-bath-soaps?filter='Nivea-Soap'"> will go to /all-my-bath-soaps and apply JavaScript filtering to show just that product, hiding the DOM nodes for all the others. The problem: if the user then changes the filter text from 'Nivea-Soap' to 'Lynx', the JavaScript will work fine and show the new products, but the URL stays at ?filter='Nivea-Soap'. Is there anything I can do about this? Of course, I don't want to reload the page with a new query string every time they change the search criteria. It would be great to move the ?filter=... criteria into POST data instead, but I don't know how to do that with a link...
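
    One option worth noting, assuming the target browsers support the HTML5 History API: rather than moving the filter into POST data, the page can rewrite its own query string as the filter changes, with no reload. A sketch (the function name is invented):

        // Call this whenever the filter text changes.
        function syncFilterToUrl(filter) {
            if (window.history && history.replaceState) {
                // Updates the address bar (and what gets bookmarked/shared)
                // without reloading the page or adding a history entry.
                history.replaceState(null, "", "?filter=" + encodeURIComponent(filter));
            }
        }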

    Read the article

  • What does Enable/Disable mean in Bing's URL Normalization feature?

    - by DisgruntledGoat
    I'm in Bing Webmaster Tools, under Index URL Normalization. Many parameters are listed in the table with 3 other columns: Status, Source, Date. The "Source" column says "Webmaster" where I have added parameters, and "Bing" where I assume the parameter has been auto-detected. "Date" is probably the last date it detected the parameter. I've tried searching the help files but I can't find what the Status column means. The top of the page says: This feature allows you to specify query parameters for Bing’s crawler to ignore. But it's not clear whether "Enable" or "Disable" is related to this, and if so what happens in each case. Does anyone know?

    Read the article

  • Chrome - Why am I automatically authenticated to a web app even after clearing browser cookies?

    - by Howiecamp
    I am accessing a web application using Chrome. If I sign out of the app and clear all Chrome history/cookies/etc. (even Flash cookies, which are now handled by Chrome in the same Clear History area) and then re-access the site, I am automatically logged in without being prompted for credentials. I then launched Chrome in Incognito mode and was able to reproduce the same behavior; however, I was prompted for credentials on the first logon in Incognito mode. The web application behaves as expected in Internet Explorer 10. Some info about the application: it's a SharePoint site using NTLM authentication; the credentials are Active Directory-based, as the username is domain\username; my connection is over the Internet and there is no AD relationship between their domain and my local Windows account or my PC (in other words, I, meaning my locally logged-on user and my PC, am not in any way part of their AD domain); and the site is running SSL on port 443. Why might Chrome be automatically authenticating me?

    Read the article

  • IIS FTP service - download timeouts and restarts getting the data twice

    - by accel229
    We have an IIS FTP site on a Windows Server 2003 x64 machine. The Application Layer Gateway service is disabled (so http://support.microsoft.com/kb/931130 does not apply), and the Windows Firewall service is disabled as well. The connection timeout for the FTP site (there is only one) is set to 1,200 seconds = 20 minutes. An external client can connect to the site, list directory contents and download small files. When a client attempts to download a large file (e.g., a download that runs for 3 minutes, still well under 20 minutes but relatively long), the server sends all the data, then the connection times out. The client issues REST / RETR commands attempting to restart the download after the last byte (which I believe should succeed and receive exactly 0 bytes), but the server behaves as if the client tried to restart after byte 0, that is, it sends the entire file all over again. Any ideas on how to fix this?

    Read the article

  • Where is the IIS7 redirect occurring?

    - by neildeadman
    I have a site set up in IIS7 that is working, although I don't understand how. The binding is www.mysite.fr on port 80. In the root of the site there are no files, only several folders, yet the site loads fine; I don't understand how it is loading. If I go to https://www.mysite.fr/ it redirects to https://www.mysite.fr/fr/, which redirects back to https://www.mysite.fr/, so we end up in an endless loop which fails. Fiddler shows it is a 301 redirect, but I have no idea where this is set! I guess it might be in the website code, but as I don't know which file is loaded first, I don't know where to look. Any ideas? All other sites are working fine...
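
    One place to look, offered as a guess rather than a diagnosis: IIS 7 stores per-site and per-folder redirects in web.config files, so a redirect can hide in any folder along the request path. An entry of roughly this shape (values are illustrative) in the site root or the /fr/ folder would produce a 301 like the one Fiddler shows:

        <!-- Illustrative web.config fragment; httpRedirect is IIS 7's "HTTP Redirect" feature. -->
        <configuration>
          <system.webServer>
            <httpRedirect enabled="true" destination="https://www.mysite.fr/fr/" httpResponseStatus="Permanent" />
          </system.webServer>
        </configuration>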

    Read the article
