Search Results

Search found 11753 results on 471 pages for 'technical links'.

  • Why does my Google chat get blocked (by corporate firewall) some days but not others?

    - by Peter
    I have noticed that some days I am able to chat while using Gmail, and other days I am not. It would make sense to me that I would either always be blocked, or never. But I can't figure out why it seems to change daily or weekly. Is Google constantly changing the URLs involved so that the censoring companies (they use Websense where I work) have to play catch-up? Or is there some other reason I'm missing? I am more interested in the technical reason this might be happening than in an actual workaround.

    Read the article

  • Creating my own simple CMS for ASP.NET MVC?

    - by coure06
    I want to create a simple CMS for my ASP.NET MVC site and need some help getting started. Should I save the whole page to the database? And what if my page contains links like Url.Content("~/somepage")? When the admin edits the page he will see the plain link, not the Url.Content call. How can I handle this in the CMS?
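
    One common way to handle this, independent of any particular CMS, is to store a neutral placeholder token in the database and expand it to the real application root only when the page is rendered, so the admin always edits the readable token. Below is a minimal sketch of that substitution step, written in TypeScript purely to illustrate the idea; the "{APPROOT}" token and the root value are assumptions, not part of ASP.NET MVC.

        // Hypothetical render-time step: expand a stored placeholder into the app root.
        // In ASP.NET MVC the appRoot argument would be whatever Url.Content("~/") resolves to.
        function resolveAppLinks(storedHtml: string, appRoot: string): string {
          return storedHtml.split("{APPROOT}/").join(appRoot);
        }

        // The admin edits content that keeps the readable token...
        const saved = '<a href="{APPROOT}/somepage">Some page</a>';
        // ...and the token is expanded only when the page is served.
        console.log(resolveAppLinks(saved, "/myapp/"));  // <a href="/myapp/somepage">Some page</a>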

    Read the article

  • Please help me choose good books on algorithms

    - by davit-datuashvili
    I would like help choosing good books on algorithms. Many people on this site tell me to show my code, and now I am asking for help choosing good books instead. I don't own any books on algorithms, and if I decide to buy one it should of course be a high-quality one. Any ideas or links are welcome.

    Read the article

  • jQuery: click() not working in IE 7

    - by Patrick
    Hello, I cannot make the click() function work in IE7 for the tag links at the top of the page on this website: http://www.sanstitre.ch/drupal/portfolio?tid[0]=38. Everything works perfectly in other browsers, and the z-index of the header is higher than that of the rest of the content. Thanks.
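
    One thing worth trying in old IE is attaching the handler by delegation from the document instead of binding it directly to the links; with jQuery 1.7 or later this often behaves more predictably. The sketch below assumes jQuery is loaded globally, and "a.tag-link" is only a placeholder for whatever selector actually matches the tag links on that page.

        declare const $: any;  // jQuery (1.7+) is assumed to be loaded globally

        // Delegated click handling: the handler lives on the document, so it also
        // fires for links that are re-rendered after page load.
        $(document).on("click", "a.tag-link", function (event: any) {
          event.preventDefault();                              // stop navigation while debugging
          console.log("clicked:", $(event.currentTarget).attr("href"));
        });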

    Read the article

  • XPath: remove attribute

    - by David
    Hi, does anyone know how to remove an attribute using XPath? In particular, I want to remove the rel attribute and its text from a link, i.e. given <a href='http://google.com' rel='some text'>Link</a> I want to remove rel='some text'. There will be multiple links in the HTML I am parsing.
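
    XPath itself only selects nodes; the actual removal is done by whatever DOM or XML library is hosting the query. A small sketch of that pattern using the browser DOM's document.evaluate (assuming the HTML is loaded into a DOM document; other XML libraries expose the same idea under different method names):

        // Select every rel attribute that sits on an <a> element, then delete each
        // one from the element that owns it.
        const rels = document.evaluate(
          "//a/@rel", document, null, XPathResult.ORDERED_NODE_SNAPSHOT_TYPE, null
        );
        for (let i = 0; i < rels.snapshotLength; i++) {
          const relAttr = rels.snapshotItem(i) as Attr;
          relAttr.ownerElement?.removeAttribute("rel");
        }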

    Read the article

  • SlideShare API feed for JavaScript / jQuery

    - by Alexander Corotchi
    Hi everybody, can you help me with some information to speed this up? Can I use the SlideShare API feed with JavaScript (jQuery)? In the documentation at http://www.slideshare.net/developers/documentation I only see a "Response XML Format" and no JSON response. Can somebody help me with that: some helpful links, or some suggestions on how I can use the SlideShare API feeds with JavaScript? Thanks a lot!
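
    An XML-only response is not a blocker for JavaScript: the XML can be fetched as text and walked with DOMParser (or jQuery's parseXML). A rough sketch of that approach follows; the URL and the <Title> element name are placeholders, not the real SlideShare API shape.

        // Fetch an XML response in the browser and read values out of it.
        async function fetchXmlTitles(url: string): Promise<string[]> {
          const text = await (await fetch(url)).text();
          const doc = new DOMParser().parseFromString(text, "application/xml");
          return Array.from(doc.getElementsByTagName("Title"))
            .map(el => el.textContent ?? "");
        }

        // Placeholder usage: fetchXmlTitles("https://example.com/api?format=xml")
        //   .then(titles => console.log(titles));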

    Read the article

  • Hidden folders on the Internet

    - by lego69
    Very often on the Internet I see links like this: www.abcde.com/~main/material/hello, where the part ~main/material/hello is grey. If I remove hello I get "access forbidden". Can somebody explain what this system is, and whether it is possible to get access?

    Read the article

  • Display database fields using First, 1, 2, ..., Last

    - by user195257
    Hello, I'm trying to figure out how to achieve this: http://t-webdesign.co.uk/projects/geusa/industry_jobs.html. Eventually there will be lots of job postings under different industries; what would be the best way of implementing the 'page' links so I can display just 7 or 8 jobs at a time? Thank you.
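
    The usual pattern is to fetch only one page of rows at a time (for example with SQL's LIMIT/OFFSET) and to build the numbered page links from the total row count. A small sketch of the arithmetic, with the page size of 8 taken from the question and everything else assumed:

        // Given the current page (1-based), work out which rows to ask the database
        // for and which page numbers to render as links.
        function paging(totalJobs: number, currentPage: number, pageSize = 8) {
          const totalPages = Math.max(1, Math.ceil(totalJobs / pageSize));
          const page = Math.min(Math.max(1, currentPage), totalPages);
          return {
            offset: (page - 1) * pageSize,   // e.g. SQL: LIMIT 8 OFFSET <offset>
            limit: pageSize,
            pageLinks: Array.from({ length: totalPages }, (_, i) => i + 1),
          };
        }

        // Example: 30 jobs, viewing page 2 -> offset 8, page links [1, 2, 3, 4]
        console.log(paging(30, 2));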

    Read the article

  • Display Friendly Date, Numbers

    - by user310657
    I am searching for a custom control for ASP.NET which helps display user-friendly dates, so that instead of the article date (2 April 2010) it displays "2 months old". I am unable to find one on Google; can anyone suggest links, custom controls, or articles for this? Thank you.
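
    If a ready-made control doesn't turn up, the underlying logic is small enough to write by hand: compare the article date with the current time and report the largest whole unit. A sketch of that calculation, written in TypeScript rather than as an ASP.NET control just to show the idea:

        // Turn an absolute date into a rough "N units old" string.
        function friendlyAge(date: Date, now: Date = new Date()): string {
          const seconds = Math.floor((now.getTime() - date.getTime()) / 1000);
          const units: [string, number][] = [
            ["year", 365 * 24 * 3600],
            ["month", 30 * 24 * 3600],
            ["day", 24 * 3600],
            ["hour", 3600],
            ["minute", 60],
          ];
          for (const [name, size] of units) {
            const count = Math.floor(seconds / size);
            if (count >= 1) return `${count} ${name}${count > 1 ? "s" : ""} old`;
          }
          return "just now";
        }

        // Example from the question: an article dated 2 April 2010, viewed two months later.
        console.log(friendlyAge(new Date(2010, 3, 2), new Date(2010, 5, 2)));  // "2 months old"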

    Read the article

  • Networking with extremely high latency.

    - by BCS
    Are there any protocols, systems, etc., experimental or otherwise, designed to allow normal (as normal as can be) network operations (e-mail, DNS, HTML, etc.) over very high-latency links? I'm thinking of minutes to an hour, or maybe two. Think light-speed lag at a solar-system scale.

    Read the article

  • Data extraction from a website URL

    - by user2522395
    With the script below I am able to extract all the links of a particular website, but I need to know how I can extract data from those links, especially e-mail addresses and phone numbers where they are present. Please help me modify the existing script to get that result, or, if you have a full sample script, please provide it.

        Private Sub btnGo_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles btnGo.Click
            'url must be in this format: http://www.example.com/
            Dim aList As ArrayList = Spider("http://www.qatarliving.com", 1)
            For Each url As String In aList
                lstUrls.Items.Add(url)
            Next
        End Sub

        Private Function Spider(ByVal url As String, ByVal depth As Integer) As ArrayList
            'aReturn is used to hold the list of urls
            Dim aReturn As New ArrayList
            'aStart is used to hold the new urls to be checked
            Dim aStart As ArrayList = GrabUrls(url)
            'temp array to hold data being passed to new arrays
            Dim aTemp As ArrayList
            'aNew is used to hold new urls before being passed to aStart
            Dim aNew As New ArrayList
            'add the first batch of urls
            aReturn.AddRange(aStart)
            'if depth is 0 then only return 1 page
            If depth < 1 Then Return aReturn
            'loops through the levels of urls
            For i = 1 To depth
                'grabs the urls from each url in aStart
                For Each tUrl As String In aStart
                    'grabs the urls and returns non-duplicates
                    aTemp = GrabUrls(tUrl, aReturn, aNew)
                    'add the urls to be checked to aNew
                    aNew.AddRange(aTemp)
                Next
                'swap urls to aStart to be checked
                aStart = aNew
                'add the urls to the main list
                aReturn.AddRange(aNew)
                'clear the temp array
                aNew = New ArrayList
            Next
            Return aReturn
        End Function

        Private Overloads Function GrabUrls(ByVal url As String) As ArrayList
            'will hold the urls to be returned
            Dim aReturn As New ArrayList
            Try
                'regex string used: thanks google
                Dim strRegex As String = "<a.*?href=""(.*?)"".*?>(.*?)</a>"
                'i used a webclient to get the source; web requests might be faster
                Dim wc As New WebClient
                'put the source into a string
                Dim strSource As String = wc.DownloadString(url)
                Dim HrefRegex As New Regex(strRegex, RegexOptions.IgnoreCase Or RegexOptions.Compiled)
                'parse the urls from the source
                Dim HrefMatch As Match = HrefRegex.Match(strSource)
                'used later to get the base domain without subdirectories or pages
                Dim BaseUrl As New Uri(url)
                'while there are urls
                While HrefMatch.Success = True
                    'loop through the matches
                    Dim sUrl As String = HrefMatch.Groups(1).Value
                    'if it's a page or sub directory with no base url (domain)
                    If Not sUrl.Contains("http://") AndAlso Not sUrl.Contains("www") Then
                        'add the domain plus the page
                        Dim tURi As New Uri(BaseUrl, sUrl)
                        sUrl = tURi.ToString
                    End If
                    'if it's not already in the list then add it
                    If Not aReturn.Contains(sUrl) Then aReturn.Add(sUrl)
                    'go to the next url
                    HrefMatch = HrefMatch.NextMatch
                End While
            Catch ex As Exception
                'catch ex here. I left it blank while debugging
            End Try
            Return aReturn
        End Function

        Private Overloads Function GrabUrls(ByVal url As String, ByRef aReturn As ArrayList, ByRef aNew As ArrayList) As ArrayList
            'overloaded function to check duplicates in aNew and aReturn
            'temp url arraylist
            Dim tUrls As ArrayList = GrabUrls(url)
            'used to return the list
            Dim tReturn As New ArrayList
            'check each item to see if it already exists, so as not to grab the urls again
            For Each item As String In tUrls
                If Not aReturn.Contains(item) AndAlso Not aNew.Contains(item) Then
                    tReturn.Add(item)
                End If
            Next
            Return tReturn
        End Function
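
    One way to get e-mail addresses and phone numbers out of the pages the spider visits is to download each page's source and run additional regular expressions over it, in the same way the script already matches <a href> tags. The sketch below shows only that extraction step, written in TypeScript rather than VB.NET to keep it short; the URL is a placeholder and the patterns are deliberately simple examples, not production-grade validators.

        // Pull e-mail addresses and phone-like numbers out of a page's HTML.
        const emailPattern = /[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}/g;
        const phonePattern = /\+?\d[\d\s().-]{6,}\d/g;

        async function extractContacts(url: string): Promise<{ emails: string[]; phones: string[] }> {
          const html = await (await fetch(url)).text();                // download the page source
          const emails = [...new Set(html.match(emailPattern) ?? [])]; // de-duplicate matches
          const phones = [...new Set(html.match(phonePattern) ?? [])];
          return { emails, phones };
        }

        // Placeholder usage:
        // extractContacts("http://www.example.com/contact").then(c => console.log(c.emails, c.phones));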

    Read the article

  • CSS issue on hover: shaky effect

    - by Sarika Thapaliya
        <style type="text/css">
            .linkcontainer{border-right: solid 0.2px white;margin-right:1px}
            .hardlink{color: #FFF !important; border: 1px solid transparent;}
            .hardlink:hover{
                background:url("/_layouts/images/bgximg.png") repeat-x -0px -489px;
                display:inline-block;
                background-color:#21374C;
                border:0.2px solid #5badff;
                line-height:20px;
                text-decoration:none !important;}
        </style>

        <div style="padding-bottom:3px;background:transparent; color:white!important; float:left; margin-right:20px; line-height:42px;">
            <span class="linkcontainer">
                <a class="hardlink" style="padding:0 10px;" href="http://hronline">HROnline</a>
            </span>
            <span class="linkcontainer">
                <a class="hardlink" style="padding:0 10px;" href="http://hronline/ec">Employee Center</a>
            </span>
            <span class="linkcontainer">
                <a class="hardlink" style="padding:0 10px;" href="http://hronline/businesscommunities">Business Communities</a>
            </span>
            <span class="linkcontainer">
                <a class="hardlink" style="padding:0 10px;" href="http://hronline/internalservices">Internal Services</a>
            </span>
            <span class="linkcontainer">
                <a class="hardlink" style="padding:0 10px;" href="http://hronline/policiesprocedures">Policies&procedures</a>
            </span>
            <span class="linkcontainer">
                <a class="hardlink" style="padding:0 10px;" href="http://hronline/qualitybestpractices">Best Practices</a>
            </span>
        </div>

    I added a right border to the spans that contain the menu links. When I hover over each menu link it also gets a background, and this causes a jerky effect on the whole container. What is causing the shaky effect on hover? I can't seem to figure it out, again.

    Read the article

  • The least amount of code possible for this MySQL query?

    - by ddan
    I have a MySQL query that: gets data from three tables linked by unique ids; counts the number of games played in each category by each user; and counts the number of games each user has played that fall under the "fps" category. It seems to me that this code could be a lot smaller. How would I go about making this query smaller? http://sqlfiddle.com/#!2/6d211/1 Any help is appreciated, even if you just give me links to check out.

    Read the article

  • Need to redirect to true root folder

    - by Brad
    I am running a website on MAMP, and the root is http://localhost/sandbox. When I have links that point to, for example, /calendar, they go to localhost/calendar; I want them to go to localhost/sandbox/calendar. What would I have to do in .htaccess to redirect everything to localhost/sandbox/ as the root?

    Read the article
