Search Results

Search found 91480 results on 3660 pages for 'large data in sharepoint list'.

  • SproutCore Todos tutorial: addButton not responding in Firefox

    - by kristian nissen
    I'm testing the SproutCore Todos tutorial. I have checked the code in step 5 and it's identical to my code, at least as far as I can see, but the addButton is not responding to click events.

        addTask: function () {
          var task;
          task = Sinatra.store.createRecord(Sinatra.Task, {
            'description': 'New Task',
            'isDone': false,
            'priority': 1
          });
          this.selectObject(task);
          this.invokeLater(function () {
            var contentIndex = this.indexOf(task);
            var list = Sinatra.mainPage.getPath('mainPane.middleView.contentView');
            var listItem = list.itemViewForContentIndex(contentIndex);
            listItem.beginEditing();
          });
          return YES;
        },

    and in the main page:

        addButton: SC.ButtonView.design({
          layout: { centerY: 0, height: 24, right: 12, width: 100 },
          title: 'Add Task',
          target: 'Sinatra.tasksController',
          action: 'addTask'
        }),

    I can't see the problem, please help. (I have only tested this in Firefox on Kubuntu.)

  • Sphinx search distributed index tuning

    - by Andriy Bohdan
    I'm deciding how to split 3 large Sphinx indexes between 3 servers. Each of the 3 indexes is searched separately. Which is more effective: hosting each index on a separate machine, for example

        machine1 - index1
        machine2 - index2
        machine3 - index3

    or splitting each index into 3 parts and hosting each part of the same index on a separate machine, for example

        machine1 - index1_chunk1, index2_chunk1, index3_chunk1
        machine2 - index1_chunk2, index2_chunk2, index3_chunk2
        machine3 - index1_chunk3, index2_chunk3, index3_chunk3
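
    For the chunked layout, each index would typically be fronted by a distributed index that fans out to the chunks on the other machines. A minimal sphinx.conf sketch of that wiring (host names, the default searchd port 9312, and chunk names are illustrative, patterned on the example above):

        # on machine1: searching "index1" queries the local chunk plus two remote agents
        index index1
        {
            type  = distributed
            local = index1_chunk1
            agent = machine2:9312:index1_chunk2
            agent = machine3:9312:index1_chunk3
        }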

  • anonymous function variable scope [js, ajax]

    - by arthurprs
    $(".delete").click( function() { var thesender = this; $(thesender).text("Del..."); $.getJSON("ajax.php", {}, function(data) { if (data["result"]) $(thesender).remove(); // variable defined outside else alert('Error!'); } ); return false; } ); This can cause problems if user clicks on another ".delete" before the ajax callback is called?

  • Why does deploying a .NET Compact Framework assembly cause .NET Desktop Framework assemblies to be deployed?

    - by Matthew Belk
    I am trying to get one of my developers set up to work on a fairly large .NETCF project. When we try to simply deploy the solution and all of its projects to a target device, deploying one of the projects triggers several assemblies from the desktop framework to be copied from the GAC to the device. What on earth could cause this? The assemblies from the "big" framework are ones like System.DirectoryServices, System.Design, and a bunch of others.

  • ASP.NET MVC RSS help needed

    - by coure06
    Following the tutorial at http://www.developerzen.com/2009/01/11/aspnet-mvc-rss-feed-action-result/ my controller code is as follows, but I am not getting any result from http://www.gadgetfind.com/rss.xml:

        public ActionResult Feed()
        {
            SyndicationFeed feed = new SyndicationFeed("Test Feed",
                "This is a test feed",
                new Uri("http://www.gadgetfind.com/rss.xml"),
                "TestFeedID", DateTime.Now);

            SyndicationItem item = new SyndicationItem("Test Item",
                "This is the content for Test Item",
                new Uri("http://www.gadgetfind.com/rss.xml"),
                "TestItemID", DateTime.Now);

            List<SyndicationItem> items = new List<SyndicationItem>();
            items.Add(item);
            feed.Items = items;

            return new RssActionResult() { Feed = feed };
        }
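
    Note that the action above builds a brand-new feed containing only the hard-coded "Test Item"; the Uri arguments are just link metadata, so nothing is ever fetched from http://www.gadgetfind.com/rss.xml. If the intent is to re-serve that remote feed (an assumption, the question doesn't say), a minimal sketch would load it first:

        public ActionResult Feed()
        {
            using (XmlReader reader = XmlReader.Create("http://www.gadgetfind.com/rss.xml"))
            {
                // SyndicationFeed.Load pulls the remote RSS into a feed object.
                SyndicationFeed feed = SyndicationFeed.Load(reader);
                return new RssActionResult() { Feed = feed };
            }
        }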

  • WCF: parameters handled in custom channel not present in generated WSDL

    - by vfilby
    I have some special parameters on all my WCF service methods that are handled inside a custom channel and are not exposed in the service method parameter list. This works fine for JSON/XML endpoints, but I don't know how to use a SOAP endpoint with this setup, because the generated WSDL doesn't include fields that are not in the service call parameter list. Is there a way I can centralize the handling of the special parameters that apply to all service methods (authentication, locale and other contextual information) and provide a SOAP endpoint that Just Works (tm)? Hand-editing WSDL files is not an option.

  • Tracking the user function that threw the exception

    - by makerofthings7
    I've been given a large application with only one try..catch at the outermost level. This application also throws exceptions all the time, and is poorly documented. Is there any pattern I can implement that will tell me which user method threw, what exception was thrown, and also the count of exceptions? I'm thinking of using a dictionary with reflection to get the needed information, but I'm not sure if this will work. What do you think?
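
    A rough sketch of the dictionary idea (the key format is purely illustrative): the outer handler can record each caught exception keyed by the throwing method, which Exception.TargetSite exposes without any extra reflection plumbing.

        using System;
        using System.Collections.Generic;

        static class ExceptionTally
        {
            static readonly Dictionary<string, int> Counts = new Dictionary<string, int>();

            public static void Record(Exception ex)
            {
                // TargetSite is the method that threw; it can be null for some rethrown exceptions.
                string method = ex.TargetSite != null ? ex.TargetSite.Name : "(unknown)";
                string key = ex.GetType().Name + " in " + method;

                int count;
                Counts.TryGetValue(key, out count);
                Counts[key] = count + 1;
            }
        }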

  • WCF issues with KnownType for Dictionary

    - by Tom Frey
    Hi, I have a service that exposes the following data member:

        [DataMember]
        public Dictionary<string, List<IOptionQueryResult>> QueryResultItems { get; set; }

    I have the class "OptionQuerySingleResult", which implements IOptionQueryResult. Now, I understand that I need to make the OptionQuerySingleResult type "known" to the service, and thus tried to add the KnownType in various ways:

        [KnownType(typeof(Dictionary<string, OptionQuerySingleResult[]>))]
        [KnownType(typeof(Dictionary<string, List<OptionQuerySingleResult>>))]
        [KnownType(typeof(OptionQuerySingleResult))]

    However, none of those approaches worked: on the client side I'm either getting that deserialization failed, or the server simply aborted the request, causing a connection-aborted error. Does anyone have an idea what the proper way is to get this to work? I'd like to add that if I change the QueryResultItems definition to use the concrete type instead of the interface, everything works just fine. Thanks, Tom

  • Xerces SAX parser ignores the xmlns:xsi attribute as an attribute of an element

    - by user603301
    Hi, using the Xerces SAX parser I am trying to retrieve all elements and their attributes from this XML file:

        <?xml version="1.0" encoding="UTF-8"?>
        <invoice xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                 xsi:noNamespaceSchemaLocation="my.xsd">
          <parties>
          (...)

    When getting the attributes for the element 'invoice', Xerces-C++ does not insert the 'xmlns:xsi' attribute into the list of Attributes for the element 'invoice'. However, the attribute 'xsi:noNamespaceSchemaLocation' is inserted into the list. Why? Is there a specific reason from an XML standard point of view? Is there a way to configure the Xerces-C++ SAX parser so that it inserts this attribute as well? (The documentation on setting the parser properties does not tell how.) Thanks for your help.
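
    If this is the SAX2 API, there is a likely standards-based answer (an inference, not from the question): SAX2 treats xmlns:* namespace declarations specially and, by default, does not report them as attributes unless the http://xml.org/sax/features/namespace-prefixes feature is enabled. A sketch of enabling it in Xerces-C++:

        #include <xercesc/sax2/SAX2XMLReader.hpp>
        #include <xercesc/sax2/XMLReaderFactory.hpp>
        #include <xercesc/util/XMLUni.hpp>

        using namespace xercesc;

        SAX2XMLReader* parser = XMLReaderFactory::createXMLReader();
        // With namespace-prefixes on, xmlns:* declarations appear as ordinary attributes.
        parser->setFeature(XMLUni::fgSAX2CoreNameSpacePrefixes, true);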

  • How do I hook into Tar with BASH?

    - by orb
    Long story short: I am working with tar archives that contain PNG images in base64 encoding. I would like to use Bash (or whatever else works) to hook into the extraction function of tar, to decode the PNG images from base64 to standard PNG encoding after the files are unpacked. A simple

        cat $input-file | base64 -d > $output-file

    will successfully decode the images. Is there a way I can hook into tar -xf so that users do not have to do any (or minimal) extra work to decode the images? In the GNU tar documentation (http://www.gnu.org/software/tar/manual/html_chapter/Backups.html#SEC97) I found that there are in fact variables reserved to hold the names of functions to be hooked into various moments of tar's execution. However, the documentation explains that these variables, along with the other variables that configure tar, live in a file named backup-specs. Unfortunately, the path to this file is not given, and running sudo find / -name backup-specs tells me that this file is not present on my Ubuntu 13.04 system.

    Background information not included in the long story short: I have been working on a browser-based (WebGL) particle effect creation application (http://www.particleeffect.org), (https://github.com/cgrabowski/webgl-particle-effect-editor), (https://github.com/cgrabowski/webgl-particle-effect). I have begun to write a client-side-only solution for saving and loading effect data as a tar archive. However, since client-side JavaScript has limited capability to process binary data, the images used as textures in the effect are saved with base64 encoding. I have been able to implement saving effect data as a tar archive (haven't pushed that to GitHub yet), but the images present in said tar archive cannot be manipulated unless they are decoded from base64.
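
    Since the backup-specs hooks belong to GNU tar's backup scripts rather than to plain tar -xf, one practical fallback is a small wrapper script that extracts and then decodes in place. A sketch, under the assumption that every extracted .png file is base64-encoded text (names and layout are illustrative):

        #!/bin/bash
        # Usage: untar-and-decode.sh archive.tar [dest-dir]
        set -e
        archive=$1
        dest=${2:-.}

        tar -xf "$archive" -C "$dest"

        # Decode each extracted PNG from base64 in place.
        find "$dest" -type f -name '*.png' -print0 |
        while IFS= read -r -d '' f; do
            base64 -d "$f" > "$f.tmp" && mv "$f.tmp" "$f"
        done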

  • SQL string formatter

    - by Paul D. Eden
    Does anyone know of a program, a utility, or some programmatic library, preferably for Linux, that takes an unformatted SQL string and pretty-prints it? For example, I would like the following

        select * from users where name = 'Paul'

    to be changed into something like

        select *
        from users
        where name = 'Paul'

    The exact formatting is not important. I just need something to take a large SQL string and break it up into something more readable.
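
    One library in this vein (a suggestion, not something from the question) is python-sqlparse, which does exactly this kind of re-indentation:

        import sqlparse

        sql = "select * from users where name = 'Paul'"
        # reindent inserts line breaks and indentation; keyword_case normalizes keywords
        print(sqlparse.format(sql, reindent=True, keyword_case='upper'))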

  • What electronic scrum/kanban board do you use and recommend for distributed teams?

    - by Derick Bailey
    I have a coworker on a team that is fairly distributed and fairly large (for our company) and wants to take advantage of visual management tools like scrum/kanban boards. Since they are a somewhat distributed team, though, all of the issue/work management must be done with an electronic tool (we currently use Trac). What issue/work management tools with a scrum/kanban board visualization do you use for your distributed scrum/kanban teams? Would you recommend it, and if so, why? Thanks.

  • Postfix: errors using Google Apps for SMTP

    - by Zed Said
    I am using Postfix and need to send the mail using the Google Apps SMTP server. I am getting errors after I thought I had set everything up correctly:

        May 11 09:50:57 zedsaid postfix/error[22214]: 00E009693FB: to=<[email protected]>, relay=none, delay=2466, delays=2462/3.4/0/0.06, dsn=4.7.0, status=deferred (delivery temporarily suspended: SASL authentication failed; cannot authenticate to server smtp.gmail.com[74.125.155.109]: no mechanism available)
        May 11 09:50:57 zedsaid postfix/error[22213]: 0ACB36D1B94: to=<[email protected]>, relay=none, delay=2486, delays=2482/3.4/0/0.06, dsn=4.7.0, status=deferred (delivery temporarily suspended: SASL authentication failed; cannot authenticate to server smtp.gmail.com[74.125.155.109]: no mechanism available)
        May 11 09:50:57 zedsaid postfix/error[22232]: 067379693D3: to=<[email protected]>, relay=none, delay=2421, delays=2417/3.4/0/0.06, dsn=4.7.0, status=deferred (delivery temporarily suspended: SASL authentication failed; cannot authenticate to server smtp.gmail.com[74.125.155.109]: no mechanism available)

    main.cf:

        # Debian specific: Specifying a file name will cause the first
        # line of that file to be used as the name. The Debian default
        # is /etc/mailname.
        #myorigin = /etc/mailname

        smtpd_banner = $myhostname ESMTP $mail_name (Debian/GNU)
        biff = no

        # appending .domain is the MUA's job.
        append_dot_mydomain = no

        # Uncomment the next line to generate "delayed mail" warnings
        #delay_warning_time = 4h

        readme_directory = no

        # TLS parameters
        #smtpd_tls_cert_file=/etc/ssl/certs/ssl-cert-snakeoil.pem
        #smtpd_tls_key_file=/etc/ssl/private/ssl-cert-snakeoil.key
        smtpd_use_tls=yes
        smtpd_tls_session_cache_database = btree:${data_directory}/smtpd_scache
        smtp_tls_session_cache_database = btree:${data_directory}/smtp_scache

        # See /usr/share/doc/postfix/TLS_README.gz in the postfix-doc package for
        # information on enabling SSL in the smtp client.

        myhostname = zedsaid.com
        alias_maps = hash:/etc/aliases
        alias_database = hash:/etc/aliases
        myorigin = /etc/mailname
        mydestination =
        #relayhost =
        mynetworks = 127.0.0.0/8 [::ffff:127.0.0.0]/104 [::1]/128
        mailbox_command = procmail -a "$EXTENSION"
        mailbox_size_limit = 0
        recipient_delimiter = +
        inet_interfaces = all
        delay_warning_time = 4h
        smtpd_recipient_limit = 16
        # how many error before back off.
        smtpd_soft_error_limit = 3
        # how many max errors before blocking it.
        smtpd_hard_error_limit = 12

        ## Gmail Relay
        relayhost = [smtp.gmail.com]:587
        smtp_use_tls = yes
        smtp_sasl_auth_enable = yes
        smtp_sasl_password_maps = hash:/etc/postfix/sasl_passwd
        smtp_sasl_security_options = noanonymous
        smtp_sasl_tls_security_options = noanonymous
        smtp_sasl_mechanism_filter = login
        smtp_tls_eccert_file =
        smtp_tls_eckey_file =
        smtp_use_tls = yes
        smtp_enforce_tls = no
        smtp_tls_CAfile = /etc/postfix/cacert.pem
        smtpd_tls_received_header = yes
        tls_random_source = dev:/dev/urandom
        transport_maps = hash:/etc/postfix/transport
        debug_peer_list = smtp.gmail.com
        debug_peer_level = 3

    What am I doing wrong?
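
    For context on the log lines above: "no mechanism available" from the Postfix SMTP client usually means the Cyrus SASL client plugins are not installed, rather than anything wrong in main.cf itself. On a Debian/Ubuntu system (an assumption based on the Debian-style config) the usual fix is:

        # Install the SASL mechanisms (PLAIN/LOGIN) Postfix needs to authenticate to smtp.gmail.com
        sudo apt-get install libsasl2-modules
        sudo postfix reload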

  • sIFR 3 r436: how to get really big fonts

    - by ploma
    For some reason I can't seem to get sIFR to display a font size larger than about 126px. I've tried changing MAX_FONT_SIZE in sifr.js, but it's no use. I've also tried adjusting different font sizes in the CSS, but it won't go higher than 126px. Does anybody know how to get sIFR to display a really large font size? -- Ploma

  • DataAdapter Select string from base table schema?

    - by MattSlay
    When I built my .xsd, I had to choose the columns for each table, and it made a schema for the tables, right? So how can I get that SELECT string to use as a base select command for new instances of data adapters, and then just append a WHERE and ORDER BY clause to it as needed? That would keep me from having to keep each DataAdapter's field list (for the same table) in sync with the schema of that table in the .xsd file. Isn't it common to have several DataAdapters that work on a certain table schema, but with different params in the WHERE and ORDER BY clauses? Surely one does not have to maintain (or even redundantly build) the field-list part of the SELECT strings for half a dozen DataAdapters that all work off of the same table schema. I'm envisioning something like this pseudo code:

        // Is there such a method or technique?
        BaseSelectString = MyTypedDataSet.JobsTable.GetSelectStringFromSchema()
        WhereClause = " Where SomeField = @Param1 and SomeOtherField = @Param2"
        OrderByClause = " Order By Field1, Field2"
        SelectString = BaseSelectString + WhereClause + OrderByClause
        OleDbDataAdapter adapter = new OleDbDataAdapter(SelectString, MyConn)
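
    As far as I know there is no built-in GetSelectStringFromSchema, but the generated typed DataTable does carry its column schema, so the field list can be derived at runtime. A sketch (the JobsTableDataTable name follows the typed-dataset naming convention and is hypothetical here):

        using System.Data;
        using System.Linq;

        var jobs = new MyTypedDataSet.JobsTableDataTable();
        // Build "Field1, Field2, ..." from the columns the .xsd generated.
        string fieldList = string.Join(", ",
            jobs.Columns.Cast<DataColumn>()
                .Select(c => c.ColumnName)
                .ToArray());
        string baseSelect = "SELECT " + fieldList + " FROM " + jobs.TableName;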

  • Internet-based sync software that will keep running after Windows Live Sync stops doing PC-to-PC syncs?

    - by Warren P
    According to the Wikipedia page, Microsoft Live Sync will shortly stop offering its PC-to-PC sync service. There are lots of apps that sync two PCs on the same LAN, but I want to sync two PCs that are in different cities, across the internet, traversing two different NATs, and that requires some kind of service running on the internet that both machines connect into. There are already a few questions about syncing folders and files, but this is not a duplicate, because none of them answer this basic question: Microsoft Live Sync works better than rsync or any of the linked sync solutions, because it works even when the two PCs are behind NATs and firewalls that forbid direct connectivity; Windows Live Sync has a free, always-on internet server that all the client PCs connect into. I'm looking for a free (no fees) Microsoft Live Sync work-alike PC-to-PC sync solution that works between PCs and Macs at least as well as between PCs, and works behind NAT and firewalls at least as well as Microsoft's solution. (Note that Microsoft's solution makes only outbound socket calls to a Microsoft server, so this solution must necessarily include a server-hub component that is hosted publicly on a free site and does not require that I set up, manage, and pay for my own public internet hosting.) Hint: none of the answers in the linked duplicate are equivalent (PureSync, FreeFileSync, BestSync 2010, SyncButler, Comodo BackUp, QuickShadow, Gbridge), in that none of them work for the PC-to-Mac situation where firewalls and NATs prevent direct connection, or else they require money to be paid. When Microsoft Live Sync / Live Mesh finally kills direct PC-to-PC mode, the limitation will be that you will have to pay for more than 25 GB of cloud service, and you can then only sync PC #1 to PC #2 by first syncing up to the cloud and then down to the other clients. I can currently sync 100 GB of data from one computer to another, only temporarily "moving the data" through Microsoft's data servers without using up my SkyDrive storage quota.

  • How to deserialize a null array to null in C#?

    - by Aen Sidhe
    Here is my class:

        public class Command
        {
            [XmlArray(IsNullable = true)]
            public List<Parameter> To { get; set; }
        }

    When I serialize an object of this class:

        var s = new XmlSerializer(typeof(Command));
        s.Serialize(Console.Out, new Command());

    it prints, as expected (the XML header and default MS namespaces are omitted):

        <Command><To xsi:nil="true" /></Command>

    But when I took this XML and tried to deserialize it, I got stuck, because it always prints "Not null":

        var t = (Command)s.Deserialize(...);
        if (t.To == null)
            Console.WriteLine("Null");
        else
            Console.WriteLine("Not null");

    How do I force the deserializer to make my list null if it is null in the XML?

  • Finding related tags using acts-as-taggable-on

    - by user284194
    In tags#show I list all entries with that tag. At the bottom of the page I'd like to have something like "Related tags: linked, list, of, related, tags". My view looks like:

        <h2><%= link_to 'Tag', tags_path %>: <%= @tag.name.titleize %></h2>

        <% @entries.each do |entry| %>
          <h2><%= link_to h(entry.name), entry %></h2>
          <%- unless entry.phone.empty? -%>
            <p><%= h(entry.phone) %></p>
          <%- end -%>
          <%- unless entry.address.empty? -%>
            <p><%= h(entry.address) %></p>
          <%- end -%>
          <%- unless entry.description.empty? -%>
            <p><%= h(entry.description) %></p>
          <%- end -%>
          <p2><%= link_to "more...", entry %></p2>
        <% end %>

        Related Tags:
        <% @related.each do |tag| %>
          <%= link_to h(tag.tags), tag %>
        <% end %>

    tags_controller.rb:

        def show
          @title = Tag.find(params[:id]).name
          @tag = Tag.find(params[:id])
          @entries = Entry.paginate(Entry.find_tagged_with(@tag),
                                    :page => params[:page],
                                    :per_page => 10,
                                    :order => "name")
          @related = Entry.tagged_with(@tag, :on => :tags)
        end

    Every entry has at least one tag; it's required by the entry model. I'd like duplicate tags to be ignored, along with the current tag (the tag that the list belongs to). My current code displays this:

        Related Tags: Gardens Gardens ToursGardens

    Here "Gardens" is a link to the entry, not to the tag gardens, and "ToursGardens" is a link to the entry that includes those tags. My desired result would be:

        Related Tags: Gardens, Tours

    with each link pointing to its associated tag. Can anyone help me achieve this? I tried using a div_for but I don't think that was right.
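
    A sketch of one way to get the desired list (untested; note that @related as defined in the controller holds entries, not tags, which is why entry names are being rendered): collect the tags of the listed entries, drop the current tag, de-duplicate, then link to each tag.

        # tags_controller.rb
        @related = @entries.map(&:tags).flatten.uniq - [@tag]

        # tags/show view
        Related Tags:
        <% @related.each do |tag| %>
          <%= link_to h(tag.name), tag_path(tag) %>
        <% end %>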

  • Best Practice: Import CSV to MySQL Database using PHP 5.x

    - by ArneRie
    Howdy folks, what is the best solution for importing large numbers of records into a MySQL or Oracle database? I think there are two ways:

        1. Insert every record over one persistent connection
        2. Build one big SQL statement and send it to the database in a single query

    I am not sure which performs better. Are there any best-practice solutions for this kind of operation?
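
    For MySQL specifically there is a third option worth measuring against both of these: LOAD DATA INFILE, which is built for bulk CSV import. A sketch (the file path and table layout are assumptions):

        LOAD DATA LOCAL INFILE '/path/to/records.csv'
        INTO TABLE records
        FIELDS TERMINATED BY ',' ENCLOSED BY '"'
        LINES TERMINATED BY '\n'
        IGNORE 1 LINES;

    From PHP 5 this can be sent like any other statement, e.g. with mysqli_query(), provided LOCAL INFILE is enabled on both client and server.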

  • Ajax request with JQuery on page unload

    - by Rob
    I'm trying to do this:

        $(window).unload(function () {
          $.ajax({
            type: "POST",
            url: "http://localhost:8888/test.php?",
            data: "test",
            success: function (msg) {
              alert("Data Saved: " + msg);
            }
          });
          alert(c);
        });

    However, the success alert is never shown, nor does this request seem to even be hitting the server. What am I doing wrong? Thanks!
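
    One likely explanation (an inference, not from the question): browsers tear the page down during unload before an asynchronous request completes, so the request never finishes and the success callback never fires. The classic workaround of that era was a synchronous request, at the cost of blocking the UI:

        $(window).unload(function () {
          // async: false keeps the page alive until the request completes
          $.ajax({
            type: "POST",
            url: "http://localhost:8888/test.php",
            data: { payload: "test" },
            async: false
          });
        });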

  • All embedded databases fail to open connections

    - by rsteckly
    Hi, I'm working on a WinForms desktop application that needs to store data, and I made the really bad decision to try to embed a database. I've tried:

        SQLite
        VistaDB
        SQL Server Compact

    In each case, I was able to generate an Entity Framework model over the basic schema I've created, and I have an event that adds data, which I've been using to test these databases. Well, I kept adding a new record using EF and finding that it didn't actually insert a record. In debugging, I checked the context object to see what was happening. It turns out it was saying "the underlying provider failed to open", or something to that effect. It was not throwing an exception, just not inserting a record. The same thing happened for all three embedded databases, prompting me to get it through my dense head that there has to be something wrong with my configuration. Well, I tried to write some basic SQL using a SqlConnection and SqlCommand. This time it throws an exception. In the SQL Server Compact case, it now says:

        A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified)

    I thought perhaps the problem was the path in app.config, so I changed the connection string, simplifying the path away from anything that might have spaces and avoiding the Data Directory substitution that causes problems when the debugging directory does not match the preconfigured value for the data directory. I'm running Windows 7; I thought perhaps it might be an access issue, so I tried running VS 2010 in Administrator mode. No luck. I also installed SQL Server Compact SP2, thinking this might be a bug. No luck. Anyway, I'm ready to pull my hair out. I'm on a tight deadline for this thing and didn't expect to spend the day trying to figure out what is going on.
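
    One observation worth adding (an inference from the error text, not from the question): the "network-related or instance-specific" message is produced by the full SQL Server client stack (System.Data.SqlClient), which cannot open an .sdf file; SQL Server Compact has its own ADO.NET provider. A sketch of the Compact-specific classes (the path and table are hypothetical):

        using System.Data.SqlServerCe;

        using (var conn = new SqlCeConnection(@"Data Source=C:\Data\app.sdf"))
        {
            conn.Open();
            using (var cmd = new SqlCeCommand("INSERT INTO Items (Name) VALUES ('test')", conn))
            {
                cmd.ExecuteNonQuery();
            }
        }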

  • Excel Export Issue displaying '#####...'

    - by Cypher
    Hey, I'm trying to export an Excel database into .txt (tab delimited), but some of my cells are quite large. When I export into a .txt, some of the cells are exported as '#######....', which is surprisingly useless. Has this happened to anyone else? Does anyone know an easy fix?
