Search Results

Search found 63386 results on 2536 pages for 'data structure'.


  • Parse MIME messages

    - by Abhimanyu
    Hi. My new project has an email module, and I need to show all the email information on the web. When I make a call to the server I get base64-encoded MIME data; after applying base64 decoding I get MIME data like the following:

        /*****************Mime data start *******************************/
        From [email protected] Tue Jun 23 12:01:02 2009
        Date: Tue, 23 Jun 2009 12:01:02 +0530
        From: Prashant R Naik <[email protected]>
        To: [email protected]
        Subject: This is a test mail
        Message-ID: <[email protected]>
        Reply-To: Prashant R Naik <[email protected]>
        MIME-Version: 1.0
        Content-Type: multipart/mixed; boundary="ReaqsoxgOBHFXBhH"
        Content-Disposition: inline
        User-Agent: Mutt/1.5.18 (2008-05-17)
        Status: RO
        Content-Length: 1912
        Lines: 52

        --ReaqsoxgOBHFXBhH
        Content-Type: text/plain; charset=us-ascii
        Content-Disposition: inline

        Test mail. Initiated by prashant

        Regards,
        --
        Prashant R Naik
        Principal Technologist | Symbian & Web2.0
        Geodesic Limited | www.geodesic.com
        Tel: +91-80-66551000

        --ReaqsoxgOBHFXBhH
        Content-Type: image/gif
        Content-Disposition: attachment; filename="trash.gif"
        Content-Transfer-Encoding: base64

        R0lGODlhEAAQANUoADJ8wTqU2DmR1TqV2DN9wTSBxTWFyTaGyTJ9wTWGyTaKzjmS1TOAxTuV
        2DaFyTN8wDiN0jiO0jSAxTeKzjqS1DN8wTqR1TWFyjB4vTOBxTmO0TmS1DaKzTeJzTqV1zSA
        xDJ8wDqS1TeKzTF4vDF4vTiO0f///zuX2gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
        AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAACH5BAEAACgALAAA
        AAAQABAAAAaDQNRpSCwWhcakcsk8mZ5Qpik5pUKvT2W1uDVWp+BiYNAImAZmz/lcDoQEFoFp
        QTFtTPKFQLCAREolJiURJhCCJhqAJRMiIhwmjSYdJgqUjQoODgkJJgecBp0mBgYXBx8ZBQxY
        UAUSDAUACLEPDwgEAAAEIBUEtygkIyMkwMMYw8EjKEEAOw==

        --ReaqsoxgOBHFXBhH
        Content-Type: image/jpeg
        Content-Disposition: attachment; filename="bx.jpg"
        Content-Transfer-Encoding: base64

        /9j/4AAQSkZJRgABAQEASABIAAD/2wBDAAEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEB
        AQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQH/2wBDAQEBAQEBAQEBAQEBAQEB
        AQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQH/wAAR
        CAAUAAoDAREAAhEBAxEB/8QAFQABAQAAAAAAAAAAAAAAAAAAAAn/xAAYEAEAAwEAAAAAAAAA
        AAAAAAAAGWen5//EABQBAQAAAAAAAAAAAAAAAAAAAAD/xAAUEQEAAAAAAAAAAAAAAAAAAAAA
        /9oADAMBAAIRAxEAPwCb4AJHym0Vp3PQJTaK07noJHgA/9k=

        --ReaqsoxgOBHFXBhH
        Content-Type: image/png
        Content-Disposition: attachment; filename="day_bg.png"
        Content-Transfer-Encoding: base64

        iVBORw0KGgoAAAANSUhEUgAAAGQAAAApCAYAAADDJIzmAAAABmJLR0QA/wD/AP+gvaeTAAAA
        CXBIWXMAAAsTAAALEwEAmpwYAAAAB3RJTUUH2AwCCS0kTriU2QAAAB10RVh0Q29tbWVudABD
        cmVhdGVkIHdpdGggVGhlIEdJTVDvZCVuAAAAXElEQVR42u3bQQEAMAgDMZiqiZtP5AwbfeQk
        NO/WvPtLMR0TABEQIAICRECACAgQAREQIAICRECACAgQAREQIAICRECACAgQAREQIAICRECA
        CAgQARGQ7NpPPasFT+0FZPjBRwYAAAAASUVORK5CYII=

        --ReaqsoxgOBHFXBhH--
        /*****************Mime data end *******************************/

    The problem is that I now have to parse this data and use it in my application. Since the data is not XML it is difficult to parse (parsing something with tags is easy). Can anyone who knows how to parse MIME data help me? I am using Erlang to parse this data. Thank you in advance.
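    The question itself targets Erlang, but as a rough illustration of what a MIME parser has to do, here is a minimal JavaMail sketch that splits a multipart/mixed message like the one above into its parts. It assumes the decoded message has been saved to a file and that the leading mbox-style "From ..." line has been stripped; the file name is made up.

        import java.io.FileInputStream;
        import java.io.InputStream;
        import java.util.Properties;
        import javax.mail.BodyPart;
        import javax.mail.Multipart;
        import javax.mail.Session;
        import javax.mail.internet.MimeMessage;

        public class MimeDump {
            public static void main(String[] args) throws Exception {
                Session session = Session.getDefaultInstance(new Properties());
                try (InputStream in = new FileInputStream("message.eml")) {
                    // Parse the raw RFC 2822 text shown above.
                    MimeMessage msg = new MimeMessage(session, in);
                    System.out.println("Subject: " + msg.getSubject());
                    Object content = msg.getContent();
                    if (content instanceof Multipart) {
                        Multipart mp = (Multipart) content;
                        // Walk the parts delimited by the multipart boundary.
                        for (int i = 0; i < mp.getCount(); i++) {
                            BodyPart part = mp.getBodyPart(i);
                            System.out.println(part.getContentType() + " -> " + part.getFileName());
                        }
                    }
                }
            }
        }

    The same boundary-splitting approach applies in Erlang: separate the headers from the body, read the boundary parameter from Content-Type, split the body on the "--boundary" markers, and base64-decode any part whose Content-Transfer-Encoding says so.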


  • ps forest for session id

    - by azatoth
    Often I want a nice readout of which processes are running and their relationships; out of habit I usually run ps auxfww and eventually grep for the process in question. Having thought about the problem, I tried to create a one-liner that gets the process tree in ps ufww format for all processes sharing the session id of an arbitrary process name (or names), ending up with the following code:

        ps ufww --sid=$(ps -C apache2 -o sess --no-headers | sort | uniq | grep -v -E '^ +0$' | awk 'NR==1{x=$0;next}NF{x=x","$0};END{gsub(/[[:space:]]*/,"",x);print x}')

    giving, for example, the following output:

        USER       PID %CPU %MEM   VSZ  RSS TTY  STAT START  TIME COMMAND
        root      4157  0.0  0.1 41264 3120 ?    Ss   Jun11  0:00 /usr/sbin/apache2 -k start
        www-data  4329  0.0  0.0 41264 1976 ?    S    Jun11  0:00  \_ /usr/sbin/apache2 -k start
        www-data  4330  0.0  0.0 41264 2028 ?    S    Jun11  0:00  \_ /usr/sbin/apache2 -k start
        www-data  4331  0.0  0.0 41264 2028 ?    S    Jun11  0:00  \_ /usr/sbin/apache2 -k start
        www-data  4332  0.0  0.0 41264 2028 ?    S    Jun11  0:00  \_ /usr/sbin/apache2 -k start
        www-data  4333  0.0  0.0 41264 2032 ?    S    Jun11  0:00  \_ /usr/sbin/apache2 -k start
        www-data  6648  0.0  0.0 41264 1884 ?    S    Jun11  0:00  \_ /usr/sbin/apache2 -k start
        www-data  6654  0.0  0.0 41264 1884 ?    S    Jun11  0:00  \_ /usr/sbin/apache2 -k start
        www-data  6655  0.0  0.0 41264 1884 ?    S    Jun11  0:00  \_ /usr/sbin/apache2 -k start

    I now wonder if anyone has a better idea for solving this. Is there anything out there that is easier to "one-line" and gives the above (or better) information? For example, I would actually want to include all children relative to any parent. (I'm uncertain whether this should be on SF instead, but it felt more like a programming question.)


  • Looking for ideas for a simple pattern matching algorithm to run on a microcontroller

    - by pic_audio
    I'm working on a project to recognize simple audio patterns. I have two data sets, each made up of between 4 and 32 note/duration pairs. One set is predefined, the other comes from an incoming data stream. The lengths of the two strongly correlated data sets are often different, but they have roughly the same "shape". My goal is to come up with some sort of ranking of how well the two data sets correlate/match. I have converted the incoming frequencies to pitch and shifted the incoming data stream's pitch so that its average pitch matches that of the predefined data set. I also stretch/compress the incoming data set's durations to match the overall duration of the predefined set. Here are two graphical examples of data that should be ranked as strongly correlated: http://s2.postimage.org/FVeG0-ee3c23ecc094a55b15e538c3a0d83dd5.gif (sorry, as a new user I couldn't directly post images). I'm doing this on an 8-bit microcontroller, so resources are minimal. Speed is less of an issue; a second or two of processing isn't a deal breaker. It wouldn't surprise me if there is an obvious solution, I've just been staring at the problem too long. Any ideas? Thanks in advance...
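    One very simple scoring scheme that fits the normalization described above is to resample both sequences at a fixed number of evenly spaced time points and sum the absolute pitch differences. The sketch below is in Java for readability (on the microcontroller this would be fixed-point C); the Note type, the 64-step resolution and the scoring function are illustrative assumptions, not the poster's code.

        public class PatternScore {
            /** One note: pitch (e.g. in semitones) and duration (arbitrary ticks). */
            static class Note {
                final double pitch;
                final double duration;
                Note(double pitch, double duration) { this.pitch = pitch; this.duration = duration; }
            }

            /** Pitch of the note that is sounding at normalized time t in [0,1). */
            static double pitchAt(Note[] seq, double t) {
                double total = 0;
                for (Note n : seq) total += n.duration;
                double target = t * total, acc = 0;
                for (Note n : seq) {
                    acc += n.duration;
                    if (target < acc) return n.pitch;
                }
                return seq[seq.length - 1].pitch;
            }

            /** Lower score = better match: average absolute pitch difference over 64 sample points. */
            static double score(Note[] reference, Note[] incoming) {
                final int STEPS = 64;
                double sum = 0;
                for (int i = 0; i < STEPS; i++) {
                    double t = (i + 0.5) / STEPS;
                    sum += Math.abs(pitchAt(reference, t) - pitchAt(incoming, t));
                }
                return sum / STEPS;
            }
        }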


  • dojox.grid.DataGrid populated from Servlet

    - by jeff porter
    I'd like to have a Dojo dojox.grid.DataGrid that gets its data from a servlet. Problem: the data returned from the servlet does not get displayed, just the message "Sorry, an error has occured". If I place the JSON string directly into the HTML, it works. ARRRRGGH. Can anyone please help me! Thanks, Jeff Porter

    Servlet code...

        public void doGet(HttpServletRequest req, HttpServletResponse resp) {
            res.setContentType("json");
            PrintWriter pw = new PrintWriter(res.getOutputStream());
            if (response != null)
                pw.println("[{'batchId':'2001','batchRef':'146'}]");
            pw.close();
        }

    HTML code...

        <div id="gridDD" dojoType="dojox.grid.DataGrid" jsId="gridDD"
             style="height: 600x; width: 100%;"
             store="ddInfo" structure="layoutHtmlTableDDDeltaSets">
        </div>

        var rawdataDDInfo = ""; // empty at start
        ddInfo = new dojo.data.ItemFileWriteStore({
            data: {
                identifier: 'batchId',
                label: 'batchId',
                items: rawdataDDInfo
            }
        });

        <script>
            function doSelectBatchsAfterDate() {
                var xhrArgs = {
                    url: "../secure/jsonServlet",
                    handleAs: "json",
                    preventCache: true,
                    load: function(data) {
                        var xx = dojo.toJson(data);
                        var ddInfoX = new dojo.data.ItemFileWriteStore({data: xx});
                        dijit.byId('gridDD').setStore(ddInfoX);
                    },
                    error: function(error) {
                        alert("error:" + error);
                    }
                }
                // Call the asynchronous xhrGet
                var deferred = dojo.xhrGet(xhrArgs);
            }
        </script>

        <img src="go.gif" onclick="doSelectBatchsAfterDate();"/>
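    For comparison only (this is not the poster's code): a minimal servlet sketch that returns JSON in the shape dojo.data.ItemFileWriteStore expects, with a conventional JSON content type. The class name and the hard-coded payload are assumptions.

        import java.io.IOException;
        import java.io.PrintWriter;
        import javax.servlet.http.HttpServlet;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;

        public class BatchServlet extends HttpServlet {
            @Override
            protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
                // "application/json" is the usual content type for handleAs: "json" requests,
                // and ItemFileWriteStore expects an object with "identifier" and "items".
                resp.setContentType("application/json");
                resp.setCharacterEncoding("UTF-8");
                PrintWriter pw = resp.getWriter();
                pw.println("{\"identifier\": \"batchId\", \"items\": [{\"batchId\": \"2001\", \"batchRef\": \"146\"}]}");
                pw.close();
            }
        }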


  • How do I get google protocol buffer messages over a socket connection without disconnecting the client?

    - by Dan
    Hi there, I'm attempting to send a .proto message from an iPhone application to a Java server via a socket connection. However, so far I'm running into an issue when it comes to the server receiving the data: it only seems to process it after the client connection has been terminated. This suggests that the data is getting sent, but the server is keeping its input stream open and waiting for more data. Would anyone know how I might go about solving this? The current code (or at least the relevant parts) is as follows:

    iPhone:

        Person *person = [[[[Person builder] setId:1] setName:@"Bob"] build];
        RequestWrapper *request = [[[RequestWrapper builder] setPerson:person] build];
        NSData *data = [request data];
        AsyncSocket *socket = [[AsyncSocket alloc] initWithDelegate:self];
        if (![socket connectToHost:@"192.168.0.6" onPort:6666 error:nil]){
            [self updateLabel:@"Problem connecting to socket!"];
        } else {
            [self updateLabel:@"Sending data to server..."];
            [socket writeData:data withTimeout:-1 tag:0];
            [self updateLabel:@"Data sent, disconnecting"];
            //[socket disconnect];
        }

    Java:

        try {
            RequestWrapper wrapper = RequestWrapper.parseFrom(socket.getInputStream());
            Person person = wrapper.getPerson();
            if (person != null) {
                System.out.println("Persons name is " + person.getName());
                socket.close();
            }

    On running this, it seems to hang on the line where the RequestWrapper processes the input stream. I did try replacing the socket writeData method with

        [request writeToOutputStream:[socket getCFWriteStream]];

    which I thought might work; however, I get an error claiming that the "Protocol message contained an invalid tag (zero)". I'm fairly certain that it doesn't contain an invalid tag, as the message works when sending it via the writeData method. Any help on the matter would be greatly appreciated! Cheers! Dan (EDIT: I should mention, I am using the metasyntactic gpb code and the cocoaasyncsocket implementation.)
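    A note on why parseFrom behaves this way: parseFrom(InputStream) reads until end-of-stream, so over a socket it can only finish when the peer closes the connection. The usual workaround is to frame each message, for example with the length-delimited helpers in the Java protobuf runtime. Below is a minimal server-side sketch; the RequestWrapper class is the one from the question, the port and everything else are assumptions, and the iPhone side would have to write the same varint length prefix, which is not shown here.

        import java.io.InputStream;
        import java.net.ServerSocket;
        import java.net.Socket;

        public class PersonServer {
            public static void main(String[] args) throws Exception {
                try (ServerSocket server = new ServerSocket(6666)) {
                    while (true) {
                        try (Socket socket = server.accept();
                             InputStream in = socket.getInputStream()) {
                            // parseDelimitedFrom reads a varint length prefix and then exactly that
                            // many bytes, so it returns without waiting for the client to disconnect.
                            RequestWrapper wrapper = RequestWrapper.parseDelimitedFrom(in);
                            if (wrapper != null && wrapper.hasPerson()) {
                                System.out.println("Person's name is " + wrapper.getPerson().getName());
                            }
                        }
                    }
                }
            }
        }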


  • Read binary file into a struct C#

    - by Robert Höglund
    I'm trying to read binary data using C#. I have all the information about the layout of the data in the files I want to read. I'm able to read the data "chunk by chunk", i.e. getting the first 40 bytes of data, converting it to a string, getting the next 40 bytes, and so on. Since there are at least three slightly different versions of the data, I would like to read the data directly into a struct. It just feels so much more right than reading it "line by line". I have tried the following approach, but to no avail:

        StructType aStruct;
        int count = Marshal.SizeOf(typeof(StructType));
        byte[] readBuffer = new byte[count];
        BinaryReader reader = new BinaryReader(stream);
        readBuffer = reader.ReadBytes(count);
        GCHandle handle = GCHandle.Alloc(readBuffer, GCHandleType.Pinned);
        aStruct = (StructType) Marshal.PtrToStructure(handle.AddrOfPinnedObject(), typeof(StructType));
        handle.Free();

    The stream is an opened FileStream from which I have begun to read. I get an AccessViolationException when using Marshal.PtrToStructure. The stream contains more information than I'm trying to read, since I'm not interested in the data at the end of the file. The struct is defined like:

        [StructLayout(LayoutKind.Explicit)]
        struct StructType
        {
            [FieldOffset(0)]  public string FileDate;
            [FieldOffset(8)]  public string FileTime;
            [FieldOffset(16)] public int Id1;
            [FieldOffset(20)] public string Id2;
        }

    The example code has been changed from the original to make this question shorter. How would I read binary data from a file into a struct?


  • Silverlight: AutoCompleteBox and TextWrapping

    - by Sven Sönnichsen
    How do I enable TextWrapping in the AutoCompleteBox control of the Silverlight Toolkit (November 2009)? There is no property to set the wrapping mode. Is there any workaround? Sven

    Here is more information about my current problem: to me, the AutoCompleteBox consists of a list which displays all possible values and a TextBox where I enter a search string and display the selected value. I want the selected value in the TextBox to wrap. Here is my current XAML, which uses the AutoCompleteBox in a DataGrid:

        <data:DataGrid x:Name="GrdComponents" ItemsSource="{Binding Path=Components}"
                       AutoGenerateColumns="false" Margin="4"
                       VerticalAlignment="Stretch" VerticalContentAlignment="Stretch"
                       HorizontalScrollBarVisibility="Visible">
            <data:DataGrid.Columns>
                <data:DataGridTemplateColumn Header="Component" Width="230">
                    <data:DataGridTemplateColumn.CellEditingTemplate>
                        <DataTemplate>
                            <input:AutoCompleteBox
                                Text="{Binding Component.DataSource, Mode=TwoWay, ValidatesOnExceptions=True, NotifyOnValidationError=True}"
                                Loaded="AcMaterials_Loaded"
                                x:Name="Component"
                                SelectionChanged="AcMaterial_SelectionChanged"
                                IsEnabled="{Binding Component.IsReadOnly, Mode=OneWay, Converter={StaticResource ReadOnlyConverter}}"
                                BindingValidationError="TextBox_BindingValidationError"
                                ToolTipService.ToolTip="{Binding Component.Description}"
                                IsTextCompletionEnabled="False"
                                FilterMode="Contains"
                                MinimumPopulateDelay="1"
                                MinimumPrefixLength="3"
                                ValueMemberPath="Description">
                                <input:AutoCompleteBox.ItemTemplate>
                                    <DataTemplate>
                                        <TextBlock Text="{Binding DescriptionTypeNumber}"/>
                                    </DataTemplate>
                                </input:AutoCompleteBox.ItemTemplate>
                            </input:AutoCompleteBox>
                        </DataTemplate>
                    </data:DataGridTemplateColumn.CellEditingTemplate>
                </data:DataGridTemplateColumn>
            </data:DataGrid.Columns>
        </data:DataGrid>

    The AutoCompleteBox uses different values for the list (DescriptionTypeNumber) and for the selected value (Description).


  • high performance hibernate insert

    - by luke
    I am working on a latency-sensitive part of an application. Basically, I receive a network event, transform the data, and then insert all of it into the DB. After profiling I see that basically all my time is spent trying to save the data. Here is the code:

        private void insertAllData(Collection<Data> dataItems) {
            long start_time = System.currentTimeMillis();
            long save_time = 0;
            long commit_time = 0;
            Transaction tx = null;
            try {
                Session s = HibernateSessionFactory.getSession();
                s.setCacheMode(CacheMode.IGNORE);
                s.setFlushMode(FlushMode.NEVER);
                tx = s.beginTransaction();
                for(Data data : dataItems) {
                    s.saveOrUpdate(data);
                }
                save_time = System.currentTimeMillis();
                tx.commit();
                s.flush();
                s.clear();
            } catch(HibernateException ex) {
                if(tx != null) tx.rollback();
            }
            commit_time = System.currentTimeMillis();
            System.out.println("Save: " + (save_time - start_time));
            System.out.println("Commit: " + (commit_time - save_time));
            System.out.println();
        }

    The size of the collection is always less than 20. Here is the timing data that I see:

        Save: 27
        Commit: 9

        Save: 27
        Commit: 9

        Save: 26
        Commit: 9

        Save: 36
        Commit: 9

        Save: 44
        Commit: 0

    This is confusing to me. I figured the save should be quick and all the time should be spent on the commit, but clearly I'm wrong. I have also tried removing the transaction (it's not really necessary) but I saw worse times... I have set hibernate.jdbc.batch_size=20. I need this operation to be as fast as possible; ideally there would be only one round trip to the database. How can I do this?
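    One variant that is often suggested for exactly this pattern is a StatelessSession, which skips the first-level cache, dirty checking and cascades, so the inserts map almost directly onto batched JDBC statements. A minimal sketch under those assumptions follows; the session-factory accessor is made up, and the saveOrUpdate semantics are lost, since a StatelessSession only performs plain inserts/updates.

        import java.util.Collection;
        import org.hibernate.StatelessSession;
        import org.hibernate.Transaction;

        public class DataWriter {
            void insertAllData(Collection<Data> dataItems) {
                // hibernate.jdbc.batch_size still controls how many inserts go per JDBC batch.
                StatelessSession s = HibernateSessionFactory.getSessionFactory().openStatelessSession();
                Transaction tx = s.beginTransaction();
                try {
                    for (Data data : dataItems) {
                        s.insert(data);   // plain INSERT, no persistence-context bookkeeping
                    }
                    tx.commit();
                } catch (RuntimeException ex) {
                    tx.rollback();
                    throw ex;
                } finally {
                    s.close();
                }
            }
        }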


  • Process results of conditional split in SSIS

    - by Robert
    I have a Data Flow Task and am connecting to a database via an OLE DB Source component to extract data. This data feeds into a Conditional Split component to separate the data based on a simple expression. After the evaluation of this expression, the data will end up in one of two locations: LocationA or LocationB. Alright, I have that all set up and working properly. Once the data is separated into these two locations, additional processing is to be done on the records. Here's where I am stuck: I need the processing of records in LocationA to occur before the processing of records in LocationB. Is there a way to set precedence for which tasks occur before others? If not, what is the best way to handle this? I was thinking I may need to write the data in LocationA and LocationB back out to the database and create a new Data Flow Task in the control flow to handle the order in which these records must be dealt with. Any help is greatly appreciated!


  • Copy Constructors and calling functions

    - by helixed
    Hello, I'm trying to call an accessor function in a copy constructor, but it's not working. Here's an example of my problem:

    A.h

        class A {
        public:
            //Constructor
            A(int d);
            //Copy Constructor
            A(const A &rhs);
            //accessor for data
            int getData();
            //mutator for data
            void setData(int d);
        private:
            int data;
        };

    A.cpp

        #include "A.h"

        //Constructor
        A::A(int d) {
            this->setData(d);
        }

        //Copy Constructor
        A::A(const A &rhs) {
            this->setData(rhs.getData());
        }

        //accessor for data
        int A::getData() {
            return data;
        }

        //mutator for data
        void A::setData(int d) {
            data = d;
        }

    When I try to compile this, I get the following error:

        error: passing 'const A' as 'this' argument of 'int A::getData()' discards qualifiers

    If I change rhs.getData() to rhs.data, then the constructor works fine. Am I not allowed to call functions in a copy constructor? Could somebody please tell me what I'm doing wrong? Thanks, helixed


  • jQuery's getScript and the local file system-- limitations/alternatives?

    - by user210099
    Right now I'm working on a help system which is based on a local file system. It is intended to be shipped with a product which is not used on internet-enabled machines, so it must be a stand-alone web page, without any dependencies on a web server. This introduces a few challenges. Namely, the directory structure that the files live in requires navigating "up and over" to access some .js files which are required to display the help system. This used to be implemented using the jQuery getScript function, but I have run into some problems using it on the local file system. At first glance, it seemed that if my web page was being served out of the C:/dev/webpage/html/ directory, and the files I needed were in C:/dev/webpage/js/(topic)/file.js, I could just build an absolute path (file:///...) and pass that into the getScript function. However, after testing this, it does not seem that the getScript function will let me go up a level from the html directory (where the HTML file with the main code for the web page is located). Unfortunately, I can not change the directory structure, nor can I change the .js file structure/format. Is there an alternative for loading/executing JavaScript files that are in a file structure where I need to go "up and over"? Thanks,


  • jQuery autopopulate select drop down from JSON issues

    - by Jonathon Joyce
    I have an issue with auto-populating a select dropdown from jQuery/JSON data which is returned from a ColdFusion CFC. The code is below:

        $(function(){
            $("#licences-add").dialog({autoOpen:false, modal:true, title:'Add Licences', height:250, width:380});
        });

        function openAddLicence(intInstanceID, szName, szDatasourceName){
            $.getJSON('/ASPAdmin/billing/handler.cfc?method=ListLicenceTypes&queryformat=column',
                {szInstanceDatasource: szDatasourceName},
                function(data){
                    $.each(data, function(){
                        $('<option></option>').val(data.DATA.UUIDLICENCETYPE).text(data.DATA.SZLICENCETYPE).appendTo('#uuidLicenceType');
                    });
                });
            $("#intInstanceID").attr('value', intInstanceID);
            $('span#szInstanceName').text(szName);
            $("#licences-add").dialog('open');
        };

    The JSON returned is:

        {"ROWCOUNT":1,"COLUMNS":["UUIDLICENCETYPE","SZLICENCETYPE"],"DATA":{"UUIDLICENCETYPE":["480CE560-BCD3-C7AC-AF50B3C71BBCC473"],"SZLICENCETYPE":["Standard"]}}

    However, I get the following error:

        $("").val(this.UUIDLICENCETYPE).text is not a function

    Any ideas? HTML:

        <tr>
            <td><label for="uuidLicenceType" title="Select the licence type (required).">Licence Type</label> <span class="req">*</span></td>
            <td>
                <select name="uuidLicenceType" id="uuidLicenceType" class="bmSelect">
                    <option value=""></option>
                </select>
            </td>
        </tr>


  • multithreading in c#

    - by Lalit Dhake
    Hi, I have a console application. In it I have a process that fetches data from the database through different layers (business and data access) and stores the fetched data in the respective objects. For example, if data is fetched for a student then it is assigned to a Student object; the same goes for a school. Then a delegate calls a certain method that generates output as per the requirements. This process will execute many times, say 10 times. I want to run these processes simultaneously, not have the first start and finish before the second starts; after starting the 1st process, the 2nd, 3rd, ... 10th must start too. In other words, it should be multithreaded. How can I achieve this? Will it give me an error while opening and closing the database connection? I have tried this approach, but when the 1st thread starts, the data fetched for thread 1 is stored in its respective (Student, School) objects, and when the 2nd thread starts simultaneously, the data of the 1st thread's objects changes while control flows through the program. What do I have to do?


  • Versioning friendly, extendible binary file format

    - by Bas Bossink
    In the project I'm currently working on there is a need to save a sizable data structure to disk (edit: think dozens of MBs). Being an optimist, I thought that there must be a standard solution for such a problem; however, up to now I haven't found a solution that satisfies the following requirements:

    - .NET 2.0 support, preferably with a FOSS implementation
    - Version friendly (this should be interpreted as: reading an old version of the format should be relatively simple if the changes in the underlying data structure are simple, say adding/dropping fields)
    - Ability to do some form of random access where part of the data can be extended after initial creation (think of this as extending intermediate results)
    - Space and time efficient (XML has been excluded as an option given this requirement)

    Options considered so far:

    - Protocol Buffers: turned down by verdict of the documentation about Large Data Sets - since this comment suggested adding another layer on top, this would call for additional complexity which I wish to have handled by the file format itself.
    - HDF5, EXI: do not seem to have .NET implementations
    - SQLite / SQL Server Compact Edition: the data structure at hand would result in a pretty complex table structure that seems too heavyweight for the intended use
    - BSON: does not appear to support requirement 3
    - Fast Infoset: only seems to have paid .NET implementations

    Any recommendations or pointers are greatly appreciated. Furthermore, if you believe any of the information above is not true, please provide pointers/examples to prove me wrong.


  • Umbraco XSLT issue

    - by Brad
    I'm trying to use the Umbraco GetMedia function to write an image URL to the screen, but I receive an error parsing the XSLT file.

        <xsl:for-each select="$currentPage/descendant::node [string(data [@alias='tickerItemActive']) = '1']">
            <xsl:value-of select="data [@alias='tickerText']"/><br />
            <xsl:value-of select="umbraco.library:GetMedia(data [@alias='tickerImage'], 0)/data [@alias = 'umbracoFile']"/>
        </xsl:for-each>

    The tickerImage field contains the media ID whose URL I'd like to display. I can return the field outside the GetMedia function and it works fine. I can also replace data [@alias='tickerImage'] with '1117' (or any valid media ID), and then the XSLT passes verification and the script runs.

    THIS WORKS:

        <xsl:value-of select="umbraco.library:GetMedia('1117', 0)/data [@alias = 'umbracoFile']"/>

    THIS DOES NOT:

        <xsl:value-of select="umbraco.library:GetMedia(data [@alias='tickerImage'], 0)/data [@alias = 'umbracoFile']"/>

    Any help that can be offered would be appreciated. Thanks!


  • Linq Query to Update Nested Array Items?

    - by Brett
    I have an object structure generated from xsd.exe. Roughly, it consists of 3 nested arrays: protocols, sources, and reports. The XML looks like this:

        <protocols>
            <protocol>
                <source>
                    <report />
                    <report />
                </source>
                <source>
                    <report />
                    <report />
                </source>
            </protocol>
            <!-- more protocols -->
        </protocols>

    I need to update a single Report within the data structure. A brute force algorithm is shown below. I know that this could be done using XDocument and Linq, but I'd prefer to update the data structure and then serialize the structure back to disk. Thoughts? Brett

        bool updated = false;
        foreach (ProtocolsProtocol protocol in protocols.Protocol)
        {
            if (updated) break;
            foreach (ProtocolsProtocolSource source in protocol.Source)
            {
                if (updated) break;
                for (int i = 0; i < source.Report.Length; i++)
                {
                    ProtocolsProtocolSourceReport currentReport = source.Report[i];
                    if (currentReport.Id == report.Id)
                    {
                        currentReport.Attribute1 = report.Attribute1;
                        currentReport.Attribute2 = report.Attribute2;
                        updated = true;
                        break;
                    }
                }
            }
        }


  • how to do introspection in R (stat package)

    - by Lebron James
    Hi all, I am somewhat new to R, and I have a piece of code which generates a variable whose type I don't know. Is there any introspection facility in R which will tell me which type this variable belongs to? The following illustrates the property of this variable. I am working on linear model selection, and the resource I have is the lm result from another model. Now I want to retrieve the lm call with the command summary(model)$call so that I don't need to hard-code the model structure. However, since I have to change the dataset, I need to do a bit of modification on the "string", but apparently it is not a simple string. I wonder if there is any command similar to string.replace so that I can manipulate this variable from $call. Thanks

        > str <- summary(rdnM)$call
        > str
        lm(formula = y ~ x1, data = rdndat)
        > str[1]
        lm()
        > str[2]
        y ~ x1()
        > str[3]
        rdndat()
        > str[3] <- data
        Warning message:
        In str[3] <- data : number of items to replace is not a multiple of replacement length
        > str
        lm(formula = y ~ x1, data = c(10, 20, 30, 40))
        > str <- summary(rdnM)$call
        > str
        lm(formula = y ~ x1, data = rdndat)
        > str[3] <- 'data'
        > str
        lm(formula = y ~ x1, data = "data")
        > str <- summary(rdnM)$call
        > type str
        Error: unexpected symbol in "type str"
        >


  • Efficient alternative to merge() when building dataframe from json files with R?

    - by Bryan
    I have written the following code, which works but is painfully slow once I start executing it over thousands of records:

        require("RJSONIO")

        people_data <- data.frame(person_id=numeric(0))

        json_data <- fromJSON(json_file)
        n_people <- length(json_data)

        for(lender in 1:n_people) {
            person_dataframe <- as.data.frame(t(unlist(json_data[[person]])))
            people_data <- merge(people_data, person_dataframe, all=TRUE)
        }

        output_file <- paste("people_data",".csv")
        write.csv(people_data, file=output_file)

    I am attempting to build a unified data table from a series of JSON-formatted files. The fromJSON() function reads in the data as lists of lists. Each element of the list is a person, which then contains a list of the attributes for that person. For example:

        [[1]]
        person_id name gender hair_color

        [[2]]
        person_id name location gender height

        [[...]]

        structure(list(person_id = "Amy123", name = "Amy", gender = "F",
            hair_color = "brown"),
            .Names = c("person_id", "name", "gender", "hair_color"))

        structure(list(person_id = "matt53", name = "Matt",
            location = structure(c(47231, "IN"), .Names = c("zip_code", "state")),
            gender = "M", height = 172),
            .Names = c("person_id", "name", "location", "gender", "height"))

    The end result of the code above is a matrix where the columns are every person-attribute that appears in the structure above, and the rows are the relevant values for each person. As you can see, though, some data is missing for some of the people, so I need to ensure those show up as NA and make sure things end up in the right columns. Further, location itself is a vector with two components, state and zip_code, meaning it needs to be flattened to location.state and location.zip_code before it can be merged with another person record; this is what I use unlist() for. I then keep the running master table in people_data. The above code works, but do you know of a more efficient way to accomplish what I'm trying to do? It appears the merge() is slowing this to a crawl... I have hundreds of files with hundreds of people in each file. Thanks! Bryan


  • ADO.NET Batch Insert with over 2000 parameters

    - by Liming
    Hello all, I'm using Enterprise Library, but the idea is the same. I have a SqlStringCommand whose SQL is constructed using a StringBuilder in the form of "insert into table (column1, column2, column3) values (@param1-X, @param2-X, @parm3-X)" + " ", where "X" represents a for loop of about 700 rows:

        StringBuilder sb = new StringBuilder();
        for(int i = 0; i < 700; i++)
        {
            sb.Append("insert into table (column1, column2, column3) values (@param1-" + i + ", @param2-" + i + ", @parm3-" + i + ") ");
        }

    This is followed by constructing a command object and injecting all the parameters with their values into it. Essentially, 700 rows with 3 parameters each, so I ended up with 2100 parameters for this one SQL statement. It ran fine for about a few days and suddenly I got this error:

        ===============================================================
        A severe error occurred on the current command. The results, if any, should be discarded.
        at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
        at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
        at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
        at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
        at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString)
        at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async)
        at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, DbAsyncResult result)
        at System.Data.SqlClient.SqlCommand.InternalExecuteNon

    Any pointers are greatly appreciated.


  • JAXB - Beans to XSD or XSD to beans?

    - by bajafresh4life
    I have an existing data model. I would like to express this data model in terms of XML. It looks like I have two options if I'm to use JAXB:

    1. Create an XSD that mirrors my data model, and use xjc to create binding objects. Marshalling and unmarshalling will involve creating a "mapping" class that takes my existing data objects and maps them to the objects that xjc created. For example, in my data model I have a Doc class, and JAXB would create another Doc class with basically the same fields, and I would have to map from my Doc class to xjc's Doc class.

    2. Annotate my existing data model with JAXB annotations, and use schemagen to generate an XSD from my annotated classes.

    I can see advantages and disadvantages of both approaches. It seems that most people using JAXB start with the XSD file. It makes sense that the XSD should be the gold standard of truth, since it expresses the data model in a truly cross-platform way. I'm inclined to start with the XSD first, but it seems icky that I have to write and maintain a separate mapping class that shuttles data between my world and JAXB world. Any recommendations?
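    For reference, the second option usually amounts to little more than a handful of annotations on the existing class. A minimal sketch, assuming a Doc class with made-up fields (the field names are illustrative, not from the question):

        import javax.xml.bind.annotation.XmlAccessType;
        import javax.xml.bind.annotation.XmlAccessorType;
        import javax.xml.bind.annotation.XmlElement;
        import javax.xml.bind.annotation.XmlRootElement;

        // Annotate the existing model directly; schemagen can then derive the XSD from it.
        @XmlRootElement(name = "doc")
        @XmlAccessorType(XmlAccessType.FIELD)
        public class Doc {
            @XmlElement(required = true)
            private String title;   // hypothetical field, for illustration only

            @XmlElement
            private String body;    // hypothetical field, for illustration only

            public Doc() { }        // JAXB requires a no-arg constructor
        }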


  • funny behavior of jquery code

    - by user253530
    The funny thing is that if I remove the comment from alert(data[i].id), the code works. As it is in the example, the string is not concatenated, so I have no options in the select box. Hints? Help?

        var bookmarkingSites = '';
        $.getJSON("php/socialbookmark-get-bookmarking-sites.php", function(data){
            for(var i = 0; i < data.length; i++){
                //alert( data[i].id);
                bookmarkingSites += '<option value = \"' + data[i].id + '\">' + data[i].title + '</option>';
            }
        });

        <some more code> ------->

        toAppend += '<td><select name="sb2" id="sb2">' +
            '<option value="' + data.results[i].bookmark + '">' + data.results[i].bookmark + '</option>' +
            bookmarkingSites +
            '</select></td>';

        <some more code>


  • Suggestions on how build an HTML Diff tool?

    - by Danimal
    In this post I asked if there were any tools that compare the structure (not the actual content) of 2 HTML pages. I ask because I receive HTML templates from our designers, and frequently miss minor formatting changes in my implementation. I then waste a few hours of designer time while they sift through my pages to find my mistakes. The thread offered some good suggestions, but there was nothing that fit the bill. "Fine, then", thought I, "I'll just crank one out myself. I'm a halfway-decent developer, right?". Well, once I started to think about it, I couldn't quite figure out how to go about it. I can crank out a data-driven website easily enough, or do a CMS implementation, or throw documents in and out of BizTalk all day. I can't begin to figure out how to compare HTML docs. Well, sure, I have to read the DOM and iterate through the nodes. I have to map the structure to some data structure (how??), and then compare them (how??). It's a development task like none I've ever attempted. So now that I've identified a weakness in my knowledge, I'm even more challenged to figure this out. Any suggestions on how to get started? Clarification: the actual content isn't what I want to compare -- the creative guys fill their pages with lorem ipsum, and I use real content. Instead, I want to compare structure:

        <div class="foo">lorem ipsum<div>

    is different from

        <div class="foo"><p>lorem ipsum<p><div>
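    One possible starting point, sketched in Java: reduce each page to a structural signature (tag names plus class attributes, text dropped) and then diff the two signatures. A real tool would use a proper HTML/DOM parser rather than the crude regex below; this only illustrates the idea, and the example markup is adjusted to well-formed tags.

        import java.util.ArrayList;
        import java.util.List;
        import java.util.regex.Matcher;
        import java.util.regex.Pattern;

        public class HtmlStructure {
            // Matches opening/closing tags, capturing the tag name and the rest of the attributes.
            private static final Pattern TAG =
                    Pattern.compile("<(/?)([a-zA-Z][a-zA-Z0-9]*)([^>]*)>");
            private static final Pattern CLASS_ATTR =
                    Pattern.compile("class\\s*=\\s*\"([^\"]*)\"");

            /** Reduce an HTML string to its structural outline, ignoring all text content. */
            static List<String> signature(String html) {
                List<String> outline = new ArrayList<>();
                Matcher m = TAG.matcher(html);
                while (m.find()) {
                    String entry = m.group(1) + m.group(2);
                    Matcher c = CLASS_ATTR.matcher(m.group(3));
                    if (c.find()) entry += "." + c.group(1);
                    outline.add(entry);
                }
                return outline;
            }

            public static void main(String[] args) {
                List<String> a = signature("<div class=\"foo\">lorem ipsum</div>");
                List<String> b = signature("<div class=\"foo\"><p>lorem ipsum</p></div>");
                System.out.println(a); // [div.foo, /div]
                System.out.println(b); // [div.foo, p, /p, /div]
                System.out.println("structurally equal: " + a.equals(b));
            }
        }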


  • Redirecting users on select from autocomplete?

    - by juno-2
    Hi, I'm trying to implement the jQuery autocomplete plugin. I've got it up and running, but something is not working properly. Basically, I have an autocomplete list of employees. The list is generated from a table in a SQL database (employee_names and employee_ID), using a VB.NET handler (.ashx file). The data is formatted as employee_name-employee_ID. So far so good, and all employees are listed in the autocomplete. The problem is that I don't know how to redirect a user to a certain page (for example employee_profile.aspx) when they've selected an employee from the autocomplete. This is my redirect code, but it isn't working like it should:

        $('#fname2').result(function(event, data, formatted) {
            location.href = "employee_profile.aspx?id=" + data
        });

    For example, a user selects an employee; it then redirects the user to employee_profile.aspx?id=name of employee-id of employee (for example: employee_profile.aspx?id=John Doe-91210) instead of employee_profile.aspx?id=91210. I know I can strip the employee_ID with:

        formatResult: function(data, value) {
            return value.split("-")[1];
        }

    But I do not know how to pass that employee_ID to the redirect page. Here is my whole code:

        $().ready(function() {
            $("#fname2").autocomplete("AutocompleteData.ashx", {
                minChars: 3,
                selectFirst: false,
                formatItem: function(data, i, n, value) {
                    return value.split("-")[0];
                },
                //Not used, just for splitting employee_ID
                //formatResult: function(data, value) {
                //    return value.split("-")[1];
                //}
            });

            $('#fname2').result(function(event, data, formatted) {
                location.href = "employee_profile.aspx?id=" + data
            });
        });

    I know I'm very close and it should be something really simple, but can anyone help me out?


  • Thread implemented as a Singleton

    - by rocknroll
    Hi all, I have a commercial application made with C, C++ and Qt on the Linux platform. The app collects data from different sensors and displays it on a GUI. Each of the protocols for interfacing with the sensors is implemented using the singleton pattern and threads from the Qt QThread class. All the protocols except one work fine. Each protocol's thread run function has the following structure:

        void <ProtocolClassName>::run()
        {
            while(!mStop) //check whether screen is closed or not
            {
                mutex.lock();
                while(!waitcondition.wait(&mutex,5))
                {
                    if(mStop)
                        return;
                }
                //Code for receiving and processing incoming data
                mutex.unlock();
            } //end while
        }

    Hierarchy of the GUI:

    1. Login screen.
    2. Screen of action.

    When a user logs in from the login screen, we enter the action screen where all data is displayed and all the threads for the different sensors start. They wait on the mStop variable in idle time, and when data arrives they jump to receiving and processing it. Incoming data for the problem protocol is 117 bytes. In the main GUI thread there are timers which, when they time out, grab the running instance of the protocol using the <ProtocolName>::instance() function, check whether the update variable of the singleton class is true, and if so display the data. When the data display is done, they reset the update variable in the singleton class to false. The problematic protocol has an update time of 1 second, which is also the frame rate of the protocol. When I comment out the display function it runs fine, but when display is activated the application hangs consistently after 6-7 hours. I have asked this question on many forums but haven't received any worthwhile suggestions. I hope that here I will get some help. Also, I have read a lot of literature on singletons and multithreading, and found that people always discourage the use of singletons, especially in C++. But in my application I can think of no other design for the implementation. Thanks in advance. A Hapless programmer


  • Unknown error when submit a REST request to Liferay json API

    - by r.rodriguez
    I'm writing a script in Python to automatically update the structures in my Liferay portal, and I want to do it via the JSON REST API. I make a request to get a structure (method getStructure), and it works. But when I try to do a structure update in the portal it shows me the following error:

        ValueError: Content-Length should be specified for iterable data of type class 'dict'
        {'serviceContext': "{'prueba'}", 'serviceClassName': 'com.liferay.portlet.journal.service.JournalStructureServiceUtil', 'name': 'FOO', 'xsd': '... THE XSD OBTAINED VIA JSON ...', 'serviceParameters': '[groupId,structureId,parentStructureId,name,description,xsd,serviceContext]', 'description': 'FOO Structure', 'serviceMethodName': 'updateStructure', 'groupId': '10133'}

    What I'm doing is:

        urllib.request.Request(url = URL, data = data_update, headers = headers)

    URL is http://localhost:8080/tunnel-web/secure/json. The headers are configured with basic authentication (it works; it is tested with the getStructure method). Data is:

        data_update = {
            "serviceClassName" : "com.liferay.portlet.journal.service.JournalStructureServiceUtil",
            "serviceMethodName" : "updateStructure",
            "serviceParameters" : "[groupId,structureId,parentStructureId,name,description,xsd,serviceContext]",
            "groupId" : 10133,
            "name" : FOO,
            "description" : FOO Structure,
            "xsd" : ... THE XSD OBTAINED VIA JSON ...,
            "serviceContext" : "{}"
        }

    Does anybody know the solution? Do I have to specify the length for the dictionary, and how? Or is this a bug?

