Search Results

Search found 50980 results on 2040 pages for 'http compression'.

Page 601/2040 | < Previous Page | 597 598 599 600 601 602 603 604 605 606 607 608  | Next Page >

  • Mapstraction: Changing an Icon's image URL after it has been added?

    - by Paul Owens
    I am trying to use marker.setIcon() to change a marker's image. Although this changes the marker.iconUrl attribute, the icon itself is drawn from marker.proprietary_marker.$.icon.image, so the displayed icon remains unchanged. Is there a way to dynamically change marker.proprietary_marker.$.icon.image? Steps to reproduce: add a marker, then check the icon's image URL and the proprietary icon's image (they are the same); change the icon and check the URLs again. The icon URL has changed, but the marker still shows the old image held in the proprietary marker object.

    <head>
    <title>Map Test</title>
    <script src="http://maps.google.com/maps?file=api&v=2&key=Your-Google-API-Key" type="text/javascript"></script>
    <script src="mapstraction.js"></script>
    <script type="text/javascript">
    var map;
    var marker;

    function getMap() {
      map = new mxn.Mapstraction('myMap', 'google');
      map.setCenterAndZoom(new mxn.LatLonPoint(45.559242, -122.636467), 15);
    }

    function addMarker() {
      marker = new mxn.Marker(new mxn.LatLonPoint(45.559242, -122.636467));
      marker.addData({infoBubble: "Text", label: "Label", marker: 4, icon: "http://mapscripting.com/examples/mashups/richter-high.png"});
      map.addMarker(marker);
    }

    function changeIcon() {
      marker.setIcon("http://assets1.mapufacture.com/images/markers/usgs_marker.png");
    }

    function showIconURL() {
      alert(marker.iconUrl);
    }

    function showProprietaryIconURL() {
      alert(marker.proprietary_marker.$.icon.image);
    }
    </script>
    </head>
    <body onload="getMap()">
    <div id="myMap" style="width:627px; height:412px;"></div>
    <div>
      <input type="button" value="add marker" onclick="addMarker();">
      <input type="button" value="change icon" onclick="changeIcon();">
      <input type="button" value="show icon URL" onclick="showIconURL();">
      <input type="button" value="show proprietary icon URL" onclick="showProprietaryIconURL();">
    </div>
    </body>
    </html>

    Read the article

  • Help with making a C# P2P Chat Program

    - by Sandeep Bansal
    Hi everyone, I want to make a P2P chat client; all I want it to do is send text to each peer. I looked at the chat client from this example: http://www.geekpedia.com/tutorial239_Csharp-Chat-Part-1---Building-the-Chat-Client.html I am wondering whether it can be converted into a P2P program, and if so, how; sample code would help a lot. If it can't, how can I make a really simple P2P chat program? Code and examples would be very helpful. By the way, I did look at this article, but it didn't help me: http://msdn.microsoft.com/en-us/library/ms751502.aspx

    Read the article

  • Is this a File Header / Magic Number?

    - by Hammer Bro.
    I've got 120,000 files (way more, actually; this is just an arbitrary subset) of an unknown type. The Linux 'file' utility does not identify them (not that they're necessarily Linux files), nor do any other methods I've tried. There are only two hints about them that I currently have. One is that I suspect some compression is employed -- I have metadata that claims the file sizes are always somewhat larger than what I observe. The other is that in 100,000 of these files, the first 16 bytes are always:

    ff ee ee dd 00 00 00 00 01 00 00 00 00 00 00 00

    That really looks like a file header/magic number to me, but I just can't place it. Does anyone know what kind of files this would indicate? Alternatively, can anyone convince me that these suspiciously common bytes certainly do not indicate a specific file type? UPDATE: I don't know the exact reverse-engineering details, but most of the files in our case are zips once the first 29 (or so) bytes are ignored. So in practice the problem is solved (we know how to process the files), but in theory the question is still unanswered -- I don't know which application routinely prepends about 29 bytes to its zips. [I'm not sure if I should leave the question open or not at this point.]
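
    For probing files like this, a minimal Python sketch along the following lines can check the 16-byte signature and look for an embedded zip; the file name is a placeholder, and the 64-byte search window for the zip magic is an assumption rather than a known property of the format:

    import io
    import zipfile

    EXPECTED_HEADER = bytes.fromhex("ffeeeedd000000000100000000000000")

    def inspect(path):
        # Report whether the file starts with the mystery 16-byte header and
        # whether the rest of it looks like a zip archive behind a short prefix.
        with open(path, "rb") as fh:
            data = fh.read()

        has_header = data[:16] == EXPECTED_HEADER

        # Look for the zip local-file-header magic ("PK\x03\x04") near the start
        # instead of assuming a fixed 29-byte prefix.
        offset = data.find(b"PK\x03\x04", 0, 64)
        names = None
        if offset != -1:
            try:
                with zipfile.ZipFile(io.BytesIO(data[offset:])) as zf:
                    names = zf.namelist()
            except zipfile.BadZipFile:
                pass
        return has_header, offset, names

    print(inspect("sample.bin"))  # "sample.bin" is a hypothetical file name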

    Read the article

  • Parsing SOAP response using libxml in Ruby

    - by abhishektiwari
    I am trying to parse the following SOAP response coming from the Savon SOAP API:

    <?xml version='1.0' encoding='UTF-8'?>
    <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
      <soapenv:Body>
        <ns:getConnectionResponse xmlns:ns="http://webservice.jchem.chemaxon">
          <ns:return>
            &lt;ConnectionHandlerId>connectionHandlerID-283854719&lt;/ConnectionHandlerId>
          </ns:return>
        </ns:getConnectionResponse>
      </soapenv:Body>
    </soapenv:Envelope>

    I am trying to use libxml-ruby without any success. Basically I want to extract whatever is inside the ns:return element, and in particular the connectionHandlerID value.
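
    The question asks about libxml-ruby, but the shape of the answer is language-independent: the XML parser decodes the &lt; entities, so the text of ns:return is itself a fragment of markup that has to be parsed a second time. A rough Python sketch of that two-step parse, for illustration only:

    import xml.etree.ElementTree as ET

    SVC_NS = "{http://webservice.jchem.chemaxon}"

    def connection_handler_id(envelope_xml):
        # First pass: parse the SOAP envelope and locate ns:return.
        root = ET.fromstring(envelope_xml)
        ret = root.find(".//" + SVC_NS + "return")
        # Second pass: ret.text is the escaped payload, now decoded by the
        # parser into literal XML markup, so parse it again to reach the value.
        inner = ET.fromstring(ret.text.strip())
        return inner.text  # e.g. "connectionHandlerID-283854719"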

    Read the article

  • Lotus Domino - DAOS not reducing file size?

    - by SydxPages
    I have implemented DAOS on a Lotus Domino server (8.5.3 FP2) as follows, in the Lotus Domino Server document:

    Store file attachments in DAOS: Enabled
    Minimum size of object before Domino will store in DAOS: 64000 bytes
    DAOS base path: E:\DAOS
    Defer object deletion for: 30 days

    Transaction logging is running, and the specific test database has the following advanced properties set: Domino Attachment and Object Service (ticked), Use LZ1 compression for attachments, Compress Database Design, Compress Data. I have restarted the server. When I run a compact -c, it compacts the database but does not reduce the size. I have checked the DB in Windows Explorer (60Gb) and the size is the same before and after. I have checked the directory (E:\DAOS) and it is 35Gb in size. When I run the command 'Tell DAOSMgr Status tmp\test.nsf', I get the following response. From looking it up on the net, I believe a ticket count of 0 means that the db is not really DAOS'ed?

    Admin Process: Searching Administration Requests database
    DAOSMGR: Status tmptest.nsf started
    DAOS database status:
    Database: E:\Lotus\Domino\Data\tmp\test.nsf
    Database state = Synchronized
    Last resynchronized: 03/09/2012 02:49:13 PM
    Ticket count: 0
    DAOSMGR: Status tmp\test.nsf completed

    I have run fixup on the database. When I have tried to run the DAOS estimator it has always crashed. This was a problem with larger databases on earlier versions of Domino, but not any more. Can anyone tell me why the size has not reduced? Am I missing anything?

    Read the article

  • Can I retain a Google apps session token permanently for a specific user who logs into my google app

    - by Ali
    Hi guys, is it possible to retain, upon authorization, a single session token for a user who signs into my Google application? Currently my application seems to require the user to re-authenticate into Google Apps every now and then. I think it has to do with the session dying out or so. I have the following code:

    function getCurrentUrl() {
      global $_SERVER;
      $php_request_uri = htmlentities(substr($_SERVER['REQUEST_URI'], 0, strcspn($_SERVER['REQUEST_URI'], "\n\r")), ENT_QUOTES);
      if (isset($_SERVER['HTTPS']) && strtolower($_SERVER['HTTPS']) == 'on') {
        $protocol = 'https://';
      } else {
        $protocol = 'http://';
      }
      $host = $_SERVER['HTTP_HOST'];
      if ($_SERVER['SERVER_PORT'] != '' && (($protocol == 'http://' && $_SERVER['SERVER_PORT'] != '80') || ($protocol == 'https://' && $_SERVER['SERVER_PORT'] != '443'))) {
        $port = ':' . $_SERVER['SERVER_PORT'];
      } else {
        $port = '';
      }
      return $protocol . $host . $port . $php_request_uri;
    }

    function getAuthSubUrl($n = false) {
      $next = $n ? $n : getCurrentUrl();
      $scope = 'http://docs.google.com/feeds/documents https://www.google.com/calendar/feeds/ https://spreadsheets.google.com/feeds/ https://www.google.com/m8/feeds/ https://mail.google.com/mail/feed/atom/';
      $secure = false;
      $session = true;
      //echo Zend_Gdata_AuthSub::getAuthSubTokenUri($next, $scope, $secure, $session);
      return Zend_Gdata_AuthSub::getAuthSubTokenUri($next, $scope, $secure, $session) . (isset($_SESSION['domain']) ? '&hd=' . $_SESSION['domain'] : '');
    }

    function _regenerate_token() {
      global $BASE_URL;
      if (!$_SESSION['token']) {
        if (isset($_GET['token'])):
          $_SESSION['token'] = Zend_Gdata_AuthSub::getAuthSubSessionToken($_GET['token']);
          return;
        else:
          _regenerate_sessions();
          _redirect(getAuthSubUrl($BASE_URL . '/index.php?' . $_SERVER['QUERY_STRING']));
        endif;
      }
    }

    _regenerate_token();

    I know I'm doing it all wrong here and I don't know why :( I have a CONSUMER SECRET code but only use it wherever I need to access a Google service. However, something is wrong with my authentication, as the user has to periodically 'grant access to my application' and reauthorise himself... help please.

    Read the article

  • Create signed urls for CloudFront with Ruby

    - by wiseleyb
    History:

    1. I created a key and pem file on Amazon.
    2. I created a private bucket.
    3. I created a public distribution and used the origin id to connect to the private bucket: works.
    4. I created a private distribution and connected it the same as #3 - now I get access denied: expected.

    I'm having a really hard time generating a url that will work. I've been trying to follow the directions described here: http://docs.amazonwebservices.com/AmazonCloudFront/latest/DeveloperGuide/index.html?PrivateContent.html This is what I've got so far... doesn't work though - still getting access denied:

    def url_safe(s)
      s.gsub('+','-').gsub('=','_').gsub('/','~').gsub(/\n/,'').gsub(' ','')
    end

    def policy_for_resource(resource, expires = Time.now + 1.hour)
      %({"Statement":[{"Resource":"#{resource}","Condition":{"DateLessThan":{"AWS:EpochTime":#{expires.to_i}}}}]})
    end

    def signature_for_resource(resource, key_id, private_key_file_name, expires = Time.now + 1.hour)
      policy = url_safe(policy_for_resource(resource, expires))
      key = OpenSSL::PKey::RSA.new(File.readlines(private_key_file_name).join(""))
      url_safe(Base64.encode64(key.sign(OpenSSL::Digest::SHA1.new, (policy))))
    end

    def expiring_url_for_private_resource(resource, key_id, private_key_file_name, expires = Time.now + 1.hour)
      sig = signature_for_resource(resource, key_id, private_key_file_name, expires)
      "#{resource}?Expires=#{expires.to_i}&Signature=#{sig}&Key-Pair-Id=#{key_id}"
    end

    resource = "http://d27ss180g8tp83.cloudfront.net/iwantu.jpeg"
    key_id = "APKAIS6OBYQ253QOURZA"
    pk_file = "doc/pk-APKAIS6OBYQ253QOURZA.pem"

    puts expiring_url_for_private_resource(resource, key_id, pk_file)

    Can anyone tell me what I'm doing wrong here?
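
    For comparison, the usual flow for a CloudFront canned policy is to sign the raw policy JSON with RSA-SHA1 and apply the URL-safe character substitutions only to the base64-encoded signature, not to the policy itself. A Python sketch of that flow, assuming the third-party cryptography package and a placeholder key path:

    import base64
    import time

    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding

    def url_safe(b64):
        # CloudFront's substitutions for characters that are not URL-safe.
        return b64.replace(b"+", b"-").replace(b"=", b"_").replace(b"/", b"~")

    def signed_url(resource, key_pair_id, pem_path, expires_in=3600):
        expires = int(time.time()) + expires_in
        policy = ('{"Statement":[{"Resource":"%s","Condition":'
                  '{"DateLessThan":{"AWS:EpochTime":%d}}}]}' % (resource, expires))
        with open(pem_path, "rb") as fh:
            key = serialization.load_pem_private_key(fh.read(), password=None)
        # Sign the raw policy; only the resulting signature is made URL-safe.
        signature = key.sign(policy.encode("utf-8"), padding.PKCS1v15(), hashes.SHA1())
        sig = url_safe(base64.b64encode(signature)).decode("ascii")
        return "%s?Expires=%d&Signature=%s&Key-Pair-Id=%s" % (resource, expires, sig, key_pair_id)

    # print(signed_url("http://example.cloudfront.net/file.jpg", "KEYPAIRID", "pk.pem"))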

    Read the article

  • Integrating the Facebook login button with Facebooker (Rails plugin)

    - by dexterdeng
    I was integrating the login button with Facebooker. Since I wanted to use Facepile and customise the Facebook login button, I have to use the Facebook JS SDK, and I used Facebooker to connect to Facebook. Now I have found an issue.

    window.fbAsyncInit = function() {
      FB.init({ appId: '<%=Facebooker.api_key%>', status: true, cookie: true, xfbml: true });
    };

    (function() {
      var e = document.createElement('script');
      e.type = 'text/javascript';
      e.src = document.location.protocol + '//connect.facebook.net/en_US/all.js';
      e.async = true;
      document.getElementById('fb-root').appendChild(e);
    }());

    function fblogin() {
      var perms = "email,user_birthday,friends_location,offline_access,publish_stream,read_friendlists,user_birthday,user_location";
      FB.getLoginStatus(function(response) {
        if (response.session) {
          // logged in and connected user, someone you know
          window.location = "http://domain/account/link_user_accounts";
          return true;
        } else {
          // no user session available, someone you don't know
          FB.login(function(response) {
            if (response.session) {
              if (response.perms) {
                // after logging in to the facebook account
                $.inspect(response.perms); // returns all the perms I expected, so it should be fine here
                window.location = "http://domain/account/link_user_accounts";
              }
              return true;
            } else {
              return false;
            }
          }, "email,user_birthday,friends_location,offline_access,publish_stream,read_friendlists");
        }
      });
    }

    Let's say the api_key is "1111111111". Take a look at these lines:

    if (response.session) {
      if (response.perms) {
        $.inspect(response.perms);

    When I try to log in and call fblogin(), I'm sure that response.perms equals the perms I expected. (By the way, at that time the Facepile plugin works too: it showed my friends after I called fblogin() and connected to Facebook by typing my email and password.) So now it should run window.location = "http://domain/account/link_user_accounts"; and yes, that line runs, but the facebook_session can't be built successfully. After digging into Facebooker's code, I found this in the Rails plugin:

    def create_facebook_session
      secure_with_facebook_params! || secure_with_cookies! || secure_with_token!
    end

    Mostly it would run secure_with_cookies!, and if cookies with keys such as "fbs_#{Facebooker.api_key}", "#{Facebooker.api_key}_ss" and "#{Facebooker.api_key}_session_key" have been created, then the facebook_session can be created. But these cookies are not created after I log in to Facebook until I refresh the current page by hand; if I refresh the page, the cookies with these keys are added to the browser. Why aren't they added as soon as I log in to Facebook? I need these keys to create the facebook_session. Did I forget something beyond the code I pasted? Anybody help? Thank you very much!

    Read the article

  • TinyMce + Ajax File Manager + Codeigniter = Little Problem

    - by lucha libre
    OK, I'm using the following: TinyMCE, CodeIgniter, and the TinyMCE Ajax File Manager. I can upload correctly and it looks pretty good. However, when I view the HTML (from TinyMCE), this is what I get:

    <img src="../../../data/page/verde_enfemera.jpg" alt="" />

    What I need to be getting is the following:

    <img src="http://localhost/http/data/page/verde_enfemera.jpg" alt="" />

    Can someone help? EDIT: I changed the code in the HTML editor of TinyMCE, then saved it. When I re-opened it, the code had reverted back to the original "../../../data" form. Please help, someone.

    Read the article

  • ckeditor: toggle button in facelets

    - by Shilpa
    I am trying to toggle between CKEditor and a plain textarea in a facelet (.xhtml) file. I have used the same code in a JSP file and it works fine, but in the .xhtml file it does not toggle between CKEditor and the plain editor: it loads CKEditor both times. Can anyone please let me know what I am missing? Code of the xhtml file:

    <html xmlns="http://www.w3.org/1999/xhtml"
          xmlns:h="http://java.sun.com/jsf/html"
          xmlns:ckeditor="http://ckeditor.com">
    <head>
      <title>Welcome PAge</title>
      <script type="text/javascript" src="ckeditor/ckeditor.js"></script>
      <script type="text/javascript" src="ckeditor/adapters/jquery.js"></script>
      <script type="text/javascript" src="ckeditor/config.js"></script>
    </head>
    <body>
      <div>Welcome Page!!</div>
      <h:form>
        <center><p><h:outputText value="#{userBean.username} logged in"/></p></center>
        <center>
          <p>
            Questions:
            <h:inputTextarea id="editor1" class="ckeditor" rows="20" cols="75" />
            <br></br>
          </p>
        </center>
        <h:commandButton value="Ckeditor" onclick="ckeditor.replace('editor1');" />
        <h:commandButton value="Text editor" onclick="ckeditor.instances.editor1.destroy();" />
        <h:commandButton value="Get Data" onclick="alert(ckeditor.instances.editor1.getData());" />
        <br></br>
        <br></br>
      </h:form>
    </body>
    </html>

    Thanks in advance, Shilpa

    Read the article

  • Powershell Copy-Item fails silently

    - by R W
    I have a PowerShell 2.0 script running on Windows Server 2008 R2 64-bit that copies some Hyper-V .vhd files to another server as a 'backup solution'. The script gets a list of the .vhd files to copy, then iterates over that list to copy them using Copy-Item; it also writes some logging info to a file. The files are copied to another server (Windows Server 2003 SP2) into a directory compressed with NTFS compression. One of the files isn't copied. It's relatively big, ~68Gb; the others are 20Gb or less. The weird thing is that during the copy process the file appears on the destination server, and the log file generated seems to indicate the file is copied, judging by the difference in the times of the log file entries. I see no error messages in the log file and nothing in the event log of either machine. Here's the code that does the copy:

    Get-ChildItem $VMSource *.vhd -Recurse | ForEach-Object {
        $time = Get-Date -Format HH.mm.ss
        Add-Content $logFileName "$time : File Copy ($_) started"
        $fullname = $_.FullName
        Add-Content $logFileName "$time : Copying $fullname to $VMDestination"
        Copy-Item $fullname $VMDestination -Force -ErrorAction SilentlyContinue -ErrorVariable errors
        foreach ($error in $errors) {
            if ($error.Exception -ne $null) {
                Add-Content $logFileName "`tERROR COPYING FILE : $($error.Exception)"
            }
        }
        $time = Get-Date -Format HH.mm.ss
        Add-Content $logFileName "$time : File Copy ($_) finished"
    }

    I can only think there's some problem with copying a file that big to a compressed directory, maybe? Any ideas?

    Read the article

  • how to access webservice from one project to another project

    - by prince23
    Hi, I have a project named DBService (a service layer) at the path d:\webservice\DBService. It contains a web service which connects to the DB and returns an object of a class. Once I added a reference there, I got the URL http://localhost:2371/Jobs.svc. Now I have another project named UILayer at the path E:\School\UILayer. I added a service reference there with the URL http://localhost:2371/Jobs.svc, but I get a message saying the service is unavailable. Why is that happening? If both my web service layer and UI layer are in the same project, I am able to use the web service in the UI layer and get the required output. So I wanted to know: is there any way we can access the web service from one project in another project? Thanks in advance, prince

    Read the article

  • Ruby CMS/blog: Mephisto vs. Radiant

    - by Candidasa
    I'm looking for a blogging tool with some light CMS features in Ruby on Rails. I mainly want something simple, but configurable. I have no need for page snippets, etc. Just your basic main blog, very good (and easy) theme support, some nice sidebar stuff, a few static pages and MetaWeblog API support. I'm thinking of either using Mephisto or Radiant CMS (everything else seems half-baked or extremely lightweight at best): http://mephistoblog.com/ http://www.radiantcms.org/ Documentation for Mephisto seems very lacking and their site is a mess. I've also read some bad things about its stability. Radiant seems more stable in comparison and has heaps of useful plug-ins. However, it isn't designed for blogging out of the box; that has to be added almost as an afterthought. Creating a custom theme also seems more cumbersome with Radiant due to the sub-page/snippet feature. Which should I choose?

    Read the article

  • Can router configuration cause a decrease in download rate?

    - by Behrooz
    My download speed has gone crazy since I changed the router's IP, and nothing got fixed after factory-resetting it. The speed was 1024kb/s (128kB/s) but it is 200kb/s (max) right now. It works fine if a request is small (i.e. an HTTP request) but it gets slow if a request has a big response. Help me please (I have been downloading VS2010 for three days): http://serverfault.com/questions/135243/ No one on Server Fault helped me when I posted my question there; please migrate this to Server Fault. Thanks.

    Read the article

  • org.apache.jasper.JasperException: java.lang.NullPointerException

    - by Br3x
    I get this exception every time I try to load my webapp:

    exception
    org.apache.jasper.JasperException: java.lang.NullPointerException
        org.apache.jasper.servlet.JspServletWrapper.handleJspException(JspServletWrapper.java:538)
        org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:370)
        org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:313)
        org.apache.jasper.servlet.JspServlet.service(JspServlet.java:260)
        javax.servlet.http.HttpServlet.service(HttpServlet.java:717)

    root cause
    java.lang.NullPointerException
        org.apache.jsp.index_jsp._jspInit(index_jsp.java:22)
        org.apache.jasper.runtime.HttpJspBase.init(HttpJspBase.java:52)
        org.apache.jasper.servlet.JspServletWrapper.getServlet(JspServletWrapper.java:164)
        org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:340)
        org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:313)
        org.apache.jasper.servlet.JspServlet.service(JspServlet.java:260)
        javax.servlet.http.HttpServlet.service(HttpServlet.java:717)

    How can I solve this exception?

    Read the article

  • Error premature end of file pops up when accessing a URL

    - by kayteen
    Hi, I am using ColdFusion 8.0.1 and Solaris 10, and when I try to run this URL, http://IPADDRESS/flex2gateway/http, I receive the error message "Premature end of file". Please help me out if I am missing any installation or fix. Error details:

    [Flex] Premature end of file.
    flex.messaging.MessageException: Premature end of file.
        at flex.messaging.io.amfx.AmfxMessageDeserializer.fatalError(AmfxMessageDeserializer.java:249)
        at org.apache.xerces.util.ErrorHandlerWrapper.fatalError(Unknown Source)
        at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
        at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
        at org.apache.xerces.impl.XMLVersionDetector.determineDocVersion(Unknown Source)
        at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
        at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
        at org.apache.xerces.parsers.XMLParser.parse(Unknown Source)
        at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source)
        at javax.xml.parsers.SAXParser.parse(SAXParser.java:395)
        at javax.xml.parsers.SAXParser.parse(SAXParser.java:198)
        at flex.messaging.io.amfx.AmfxMessageDeserializer.parse(AmfxMessageDeserializer.java:103)
        at flex.messaging.io.amfx.AmfxMessageDeserializer.readMessage(AmfxMessageDeserializer.java:90)
        at flex.messaging.endpoints.amf.SerializationFilter.invoke(SerializationFilter.java:113)

    Read the article

  • How to generate complex url like stackoverflow?

    - by Freewind
    I'm using the Play framework, and I want to generate complex URLs like Stack Overflow does. For example, I want to generate a question's URL as: http://aaa.com/questions/123456/How-to-generator-a-complex-url Note the last part: it's the title of the question. But I don't know how to do it. UPDATED: In the Play framework we can define routes in the conf/routes file, and what I do is:

    GET    /questions/{<\d+>id}    Questions.show

    In this way, when we call @{Questions.show(id)} in views, it will generate http://aaa.com/questions/123456, but I don't know how to make the generated URL include a title part.
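
    Routing details aside, the slug itself is usually just the title with everything except letters and digits collapsed to hyphens, appended after the numeric id (which is typically the only part used for lookup). A rough Python sketch of the idea, using the made-up domain from the question:

    import re

    def slugify(title):
        # Collapse every run of non-alphanumeric characters into a single hyphen.
        return re.sub(r"[^A-Za-z0-9]+", "-", title).strip("-")

    def question_url(question_id, title):
        return "http://aaa.com/questions/%d/%s" % (question_id, slugify(title))

    print(question_url(123456, "How to generate a complex url?"))
    # -> http://aaa.com/questions/123456/How-to-generate-a-complex-url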

    Read the article

  • MVC relative path using virtual directory... help!

    - by kevin
    When I drag and drop an image/script/CSS file into my view, a relative path is automatically used to refer to the file, for example:

    <link href="../../Content/style.css" rel="stylesheet" type="text/css" />
    <script src="../../Scripts/jquery-min.js" type="text/javascript"></script>
    <img src="../../Images/logo.jpg" />

    This works fine when I host the site in my root directory, but if I'm using a virtual directory then only my CSS file is resolved correctly; the rest return 404, as they refer to http://{root}/Images/logo.jpg rather than http://{root}/{virtual directory}/Images/logo.jpg. Why does the CSS file work, and how do I specify the relative paths correctly for both the root and virtual-directory cases?

    Read the article

  • How do I use Python's itertools.groupby()?

    - by James Sulak
    I haven't been able to find an understandable explanation of how to actually use Python's itertools.groupby() function. What I'm trying to do is this: take a list - in this case, the children of an objectified lxml element - divide it into groups based on some criteria, and then later iterate over each of these groups separately. I've reviewed the documentation (http://docs.python.org/lib/itertools-functions.html) and the examples (http://docs.python.org/lib/itertools-example.html), but I've had trouble trying to apply them beyond a simple list of numbers. So, how do I use itertools.groupby()? Is there another technique I should be using? Pointers to good "prerequisite" reading would also be appreciated.
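
    The usual pattern is to sort by the same key you group on, since groupby() only merges adjacent items. A small self-contained sketch (the data is invented):

    from itertools import groupby

    things = [("animal", "bear"), ("animal", "duck"), ("plant", "cactus"),
              ("vehicle", "speed boat"), ("vehicle", "school bus")]

    keyfunc = lambda item: item[0]
    # groupby() only merges adjacent items, so sort by the grouping key first.
    for key, group in groupby(sorted(things, key=keyfunc), key=keyfunc):
        print(key, [name for _, name in group])
    # animal ['bear', 'duck']
    # plant ['cactus']
    # vehicle ['speed boat', 'school bus']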

    Read the article

  • Slow network file transfer (under 20KB/s) on newly built x64 Win7

    - by Mangoshake
    I am getting <20KB/s for local network file transfers. If I transfer a very small file (less than 100KB) it starts quickly, then slows down to <20KB/s, and all subsequent network file transfers are slow; a reboot is needed to reset this. If I transfer a large file it is stuck on "calculating" for a long time and then begins at <20KB/s immediately. This is a newly built desktop running Windows 7 x64 SP1, with Realtek gigabit LAN from the motherboard (ASRock Extreme3 Gen3). The problematic speed is observed on the private LAN, both through Ethernet and WiFi. The router is a D-Link DIR-655. Remote Differential Compression is off. Drivers are up to date from ASRock's website. I have tested network file transfers to and from another Windows 7 laptop and a MacBook Pro, so I am fairly certain it is the desktop's problem. The slow speed also happens in only one direction, outbound from the desktop, regardless of whether I initiate the file transfer from the origin or the destination. Inbound network file transfers and internet speeds are fine, so I don't think this is a hardware issue. I am getting 74.8MB/s internet upload speed from speedtest.net (http://www.speedtest.net/result/1852752479.png), and inbound network file transfers reach around 10-15MB/s. I am hoping this community has some insight to help me troubleshoot this. I don't see anything obviously related in the Event Viewer, and beyond that I just don't know where else to look. Any suggestions are greatly appreciated; thank you in advance.

    Read the article

  • Communicating with web service on SSL

    - by Krt_Malta
    Hi, I have a web service which was previously deployed over HTTP. I used to generate stub classes with wsimport: wsimport http://localhost:8080/MiniForumService/MiniForumService?wsdl Now I have deployed it over SSL, but when I try to generate the stub classes with wsimport https://localhost:8443/MiniForumService/MiniForumService?wsdl I get the following error: "unable to find valid certification path to requested target". I'm using a self-signed certificate on the server. How can I solve this, please? I've googled around but haven't found a solution so far. Thanks and regards, Krt_Malta

    Read the article

  • How to compare sqlite TIMESTAMP values

    - by Roel
    I have an SQLite database in which I want to select rows whose value in a TIMESTAMP column is before a certain date. I would think this to be simple, but I can't get it done. I have tried this:

    SELECT * FROM logged_event WHERE logged_event.CREATED_AT < '2010-05-28 16:20:55'

    and various variations on it, for instance with the date functions. I've read http://sqlite.org/lang_datefunc.html and http://www.sqlite.org/datatypes.html, and I would expect the column to be a numeric type and the comparison to be done on the unix timestamp value. Apparently not. Anyone who can help? If it matters, I'm trying this out in SQLite Expert Personal.
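
    It is hard to say without seeing the stored values, but comparisons like this only behave when every value in the column uses the same format: ISO 'YYYY-MM-DD HH:MM:SS' text compares correctly as a string, and unix-timestamp integers compare correctly against a number, while mixed or localized formats do not. A small Python sketch illustrating the text-timestamp case, with invented table contents:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE logged_event (id INTEGER PRIMARY KEY, created_at TIMESTAMP)")
    conn.executemany("INSERT INTO logged_event (created_at) VALUES (?)",
                     [("2010-05-27 09:00:00",), ("2010-05-29 10:30:00",)])

    # datetime() normalizes both sides to the same ISO text form before comparing.
    rows = conn.execute(
        "SELECT * FROM logged_event WHERE datetime(created_at) < datetime(?)",
        ("2010-05-28 16:20:55",)).fetchall()
    print(rows)  # only the 2010-05-27 row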

    Read the article

  • Multiple GET arguments

    - by AJ Ravindiran
    Hello, I've been working with PHP lately, and I came across something I couldn't solve. Basically, I have a form:

    <form method="get">
      <fieldset class="display-options" style="float: left">
        Search by name or ip:
        <input type="text" name="key" value="" />&nbsp;
        <input type="submit" class="button2" value="Search" />
      </fieldset>
    </form>

    The problem is that the page currently already has arguments: http://example.com/logs.php?type=admin&page=1 How would I pass the form's argument along with the already existing arguments, like so: http://example.com/logs.php?type=admin&page=1&key=name Thanks in advance, AJ.
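
    The desired result is just the existing query string with the new key appended; since a GET submit replaces the whole query string, the existing type and page values have to be carried along (for instance as hidden inputs). For illustration, a small Python sketch of merging the parameters into the target URL:

    from urllib.parse import urlencode, urlparse, parse_qsl

    def with_extra_params(url, extra):
        # Merge the existing query-string arguments with the new ones.
        parts = urlparse(url)
        params = dict(parse_qsl(parts.query))
        params.update(extra)
        return parts._replace(query=urlencode(params)).geturl()

    print(with_extra_params("http://example.com/logs.php?type=admin&page=1", {"key": "name"}))
    # -> http://example.com/logs.php?type=admin&page=1&key=name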

    Read the article

  • How (and where) to get aligned tRNA sequences (and import it into R)

    - by Tal Galili
    (This is a database / R commands question.) For my thesis work I wish to import tRNA data into R and have it aligned. My questions are: 1) What resources can I use for the data? 2) What commands might help me with the import/alignment? So far, I have found two nice repositories that hold such data: http://trnadb.bioinf.uni-leipzig.de/Result and http://gtrnadb.ucsc.edu/download.html There is also the readFASTA command from Biostrings, which does basic importing of the data into R. My problem still remains with how to handle the alignment of the tRNA. Since I am not from the field, I might be missing a very basic answer (like where I should download the data from, or what command to use). If you are willing to advise me, that would be most helpful. Many thanks in advance, Tal

    Read the article

  • Running git-svn with cron results in garbage in .git

    - by Paul
    I've setup a git-svn repo with cron to fetch from the svn repo daily. I have a script to do the fetching, and this is what is invoked by cron. Everything is fine with the repo, and the script works fine when executed manually. However, when it runs under cron, empty files get dropped into the .git directory. The files have names that look like they are some base64 output, e.g. juTrvjP6m8 and kcKf3hu3b4. Two of these files show up for every cron run. I thought these might be commit hashes, but they're not, git-show says it's an unknown revision. I set-up the repo as follows: git svn init http://svn.ip.addr/repo git svn fetch svn-remote My script looks like this: cd /gitsvn/dir git svn fetch svn-remote git svn push pub The last line pushes the repo to a separate (bare) public repo from which others can clone. I'm piping the output from the cron job to a file, which looks like this: fatal: unable to run 'git-svn' Counting objects: 21, done. Delta compression using up to 2 threads. Compressing objects: 100% (10/10), done. Writing objects: 100% (11/11), 59.08 KiB, done. Total 11 (delta 8), reused 0 (delta 0) To /gitpub/repo.git 360faf5..a153b0d trunk -> trunk The line "fatal: unable to run 'git-svn'" is alarming, but the fetch seems to go ahead anyway. Any suggestions? Where are these empty garbage files coming from, and how to stop them? Am I in for bigger problems in the future? BTW, I'm using git 1.6.3.3.

    Read the article
