Search Results

Search found 16189 results on 648 pages for 'document conversion'.

Page 146/648 | < Previous Page | 142 143 144 145 146 147 148 149 150 151 152 153  | Next Page >

  • Loading a new instance of a class through XML not working quite right

    - by Thegluestickman
    I'm having trouble with XML and XNA. I want to be able to load weapon settings through XML to make my weapons easier to create and to have less code in the actual project file. So I started out making a basic XML document, something to just assign variables with. But no matter what I changed, it gave me a new error every time. The code below gives me an "XML element 'Tag' not found" error; when I added it, it started to say the variables weren't found. What I also wanted to do in the XML file was load a texture. So I created a static class to hold my texture values, and then in the Texture tag of my XML document I would set it to that instance too. I think that's where the problems are occurring, because that's where the "XML element 'Tag' not found" error is pointing me. My XML document:

    <XnaContent>
      <Asset Type="ConversationEngine.Weapon">
        <weaponStrength>0</weaponStrength>
        <damageModifiers>0</damageModifiers>
        <speed>0</speed>
        <magicDefense>0</magicDefense>
        <description>0</description>
        <identifier>0</identifier>
        <weaponTexture>LoadWeaponTextures.ironSword</weaponTexture>
      </Asset>
    </XnaContent>

    My class to load the weapon XML:

    public static class LoadWeaponXML
    {
        static Weapon Weapons;
        public static Weapon WeaponLoad(ContentManager content, int id)
        {
            Weapons = content.Load<Weapon>(@"Weapons/" + id);
            return Weapons;
        }
    }

    public static class LoadWeaponTextures
    {
        public static Texture2D ironSword;
        public static void TextureLoad(ContentManager content)
        {
            ironSword = content.Load<Texture2D>("Sword");
        }
    }

    I'm not entirely sure if you can load textures through XML, but any help would be greatly appreciated.

    Read the article

  • Google analytics iframe code measuring visitor as two visitors

    - by Maarten
    I'm trying to measure visitors in an iframe and in the site containing the iframe. What I would like is for visitor clicks in the iframe to be seen as coming from the same visitor as the containing site, but somehow they are seen as two separate visitors. I followed the examples from http://www.blastam.com/blog/index.php/2011/02/google-analytics-cross-domain-tracking/, trimmed down to an even simpler version based on the comments saying setDomainName is no longer needed, but with setDomainName I get the same result: a click on a page and a click on the iframe are seen as 2 clicks by 2 separate visitors. This is the code in my iframe:

    if (_gaq && gaAccount.length > 0) {
        _gaq.push(['_setAccount', gaAccount]);
        _gaq.push(['_setAllowLinker', true]);
        //_gaq.push(['_setDomainName', 'none']);
        _gaq.push(['_trackPageview', 'mytestcountername']);
    }

    And this is the code in the containing page:

    <script type="text/javascript">
      var _gaq = _gaq || [];
      _gaq.push(['_setAccount', 'UA-9605474-4']);
      _gaq.push(['_setAllowLinker', true]);
      //_gaq.push(['_setDomainName', '.domain.nl']);
      _gaq.push(['_trackPageview']);

      (function() {
        var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
        ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
        var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
      })();
    </script>
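    If the iframe and the containing page are served from different domains, the classic ga.js tracker only ties the two hits to one visitor when the analytics cookies are explicitly handed across the boundary. A minimal sketch of that technique is below; the iframe id and the target URL are placeholders made up for illustration, so treat it as a starting point rather than a drop-in fix:

    <script type="text/javascript">
      // On the containing page: build the iframe URL with _getLinkerUrl so ga.js
      // inside the iframe picks up the same visitor/session cookies.
      _gaq.push(function() {
        var pageTracker = _gat._getTrackerByName();         // the default tracker created above
        var frame = document.getElementById('myFrame');     // assumed id of the iframe element
        frame.src = pageTracker._getLinkerUrl('http://other-domain.example/iframe-page.html');
      });
    </script>

    Inside the iframe, _setAllowLinker must stay true (as in your snippet). If the two pages actually share a parent domain, giving both trackers the same _setDomainName value (e.g. '.domain.nl') is usually enough on its own.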

    Read the article

  • Acrobat 9.3.2 printing hidden fields automatically

    - by Noah
    We have a few clients internally running Acrobat Standard 9.0.0 and their documents are printing fine. One user upgraded to 9.3.2, and now when they try to print some of our documents, a hidden field area is automatically printed. I can't seem to find a way to turn it off. It doesn't appear in the document, or in print preview. Using Examine Document and removing the hidden content removes the text, but not the spacing that the hidden area added. Is there a setting to never print this? It's not something we want to have to adjust each time we open a document.

    Read the article

  • Aligning Numbered Bullet Points in Word 2007

    - by FrustratedwithWord
    I am putting together a very large business manual which incorporates numbered headings, steps to follow, diagrams, etc. When using the bullet points, they align perfectly as I work through the processes. However when I include a diagram, or something different from the "norm" of text, the alignment changes. I would like all the bullet points to be aligned in the whole document regardless of where they appear in the document. Is there a way to save the settings so that the bullets always appear in the same position? Currently I am having to reset the indents by dragging the tabs on the ruler. This will be a large document, so I don't want to manually adjust the numbered bullets every time. Help would be greatly appreciated. Thanks very much.

    Read the article

  • ASP.NET WebAPI Security 4: Examples for various Authentication Scenarios

    - by Your DisplayName here!
    The Thinktecture.IdentityModel.Http repository includes a number of samples for the various authentication scenarios. All the clients follow a basic pattern:

    1. Acquire a client credential (a single token, multiple tokens, username/password).
    2. Call the service.

    The service simply enumerates the claims it finds on the request and returns them to the client. I won’t show that part of the code, but rather focus on steps 1 and 2.

    Basic Authentication
    This is the most basic (pun intended) scenario. My library contains a class that can create the Basic Authentication header value. Simply set username and password and you are good to go.

    var client = new HttpClient { BaseAddress = _baseAddress };
    client.DefaultRequestHeaders.Authorization =
        new BasicAuthenticationHeaderValue("alice", "alice");

    var response = client.GetAsync("identity").Result;
    response.EnsureSuccessStatusCode();

    SAML Authentication
    To integrate a Web API with an existing enterprise identity provider like ADFS, you can use SAML tokens. This is certainly not the most efficient way of calling a “lightweight service” ;) But very useful if that’s what it takes to get the job done.

    private static string GetIdentityToken()
    {
        var factory = new WSTrustChannelFactory(
            new WindowsWSTrustBinding(SecurityMode.Transport),
            _idpEndpoint);
        factory.TrustVersion = TrustVersion.WSTrust13;

        var rst = new RequestSecurityToken
        {
            RequestType = RequestTypes.Issue,
            KeyType = KeyTypes.Bearer,
            AppliesTo = new EndpointAddress(Constants.Realm)
        };

        var token = factory.CreateChannel().Issue(rst) as GenericXmlSecurityToken;
        return token.TokenXml.OuterXml;
    }

    private static Identity CallService(string saml)
    {
        var client = new HttpClient { BaseAddress = _baseAddress };
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("SAML", saml);

        var response = client.GetAsync("identity").Result;
        response.EnsureSuccessStatusCode();

        return response.Content.ReadAsAsync<Identity>().Result;
    }

    SAML to SWT conversion using the Azure Access Control Service
    Another possible option for integrating SAML-based identity providers is to use an intermediary service that allows converting the SAML token to the more compact SWT (Simple Web Token) format. This way you only need to roundtrip the SAML once and can use the SWT afterwards. The code for the conversion uses the ACS OAuth2 endpoint. The OAuth2Client class is part of my library.

    private static string GetServiceTokenOAuth2(string samlToken)
    {
        var client = new OAuth2Client(_acsOAuth2Endpoint);

        return client.RequestAccessTokenAssertion(
            samlToken,
            SecurityTokenTypes.Saml2TokenProfile11,
            Constants.Realm).AccessToken;
    }

    SWT Authentication
    When you have an identity provider that directly supports a (simple) web token, you can acquire the token directly without the conversion step. Thinktecture.IdentityServer e.g. supports the OAuth2 resource owner credential profile to issue SWT tokens.

    private static string GetIdentityToken()
    {
        var client = new OAuth2Client(_oauth2Address);
        var response = client.RequestAccessTokenUserName("bob", "abc!123", Constants.Realm);
        return response.AccessToken;
    }

    private static Identity CallService(string swt)
    {
        var client = new HttpClient { BaseAddress = _baseAddress };
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", swt);

        var response = client.GetAsync("identity").Result;
        response.EnsureSuccessStatusCode();

        return response.Content.ReadAsAsync<Identity>().Result;
    }

    So you can see that it’s pretty straightforward to implement various authentication scenarios using WebAPI and my authentication library. Stay tuned for more client samples!

    Read the article

  • Word 2007 Question

    - by Lijo
    Hi Team, While preparing a Word 2007 document, I made a mistake (and needless to say, I don't have any other copy of the document). While experimenting with formatting, I applied the style option "Apply Style to Body to match selection". This caused the document to end up in a totally wrong format, with numbering appearing even in tables. Have you ever faced this? Could you please tell me how to correct it? I hope you would be kind enough to answer this even though it is not strictly technical. Thanks, Lijo

    Read the article

  • Working with packed dates in SSIS

    - by Jim Giercyk
    One of the challenges recently thrown my way was to read an EBCDIC flat file, decode packed dates, and insert the dates into a SQL table.  For those unfamiliar with packed data, it is a way to store data at the nibble level (half a byte), and was often used by mainframe programmers to conserve storage space.  In the case of my input file, the dates were 2 bytes long and represented the number of days that have passed since 01/01/1950.  My first thought was, in the words of Scooby, Hmmmmph?  But, I love a good challenge, so I dove in.

    Reading in the flat file was rather simple.  The only difference between reading an EBCDIC and an ASCII file is the Code Page option in the connection manager.  In my case, I needed to use Code Page 1140 for EBCDIC (I could have also used Code Page 37).  Once the code page is set correctly, SSIS can understand what it is reading and it will convert the output to the default code page, 1252.  However, packed data is either unreadable or produces non-alphabetic characters, as we can see in the preview window.

    Column 1 is actually the packed date; columns 0 and 2 are the values in the rest of the file.  We are only interested in Column 1, which is a 2 byte field representing a packed date.  We know that 2 packed digits can be stored in 1 byte of character data, so we are working with 4 packed digits in 2 character bytes.  If you are confused, stay tuned… this will make sense in a minute.

    Right-click on your Flat File Source shape and select “Show Advanced Editor”. Here is where the magic begins. By changing the properties of the output columns, we can access the packed digits from each byte. By default, the Output Column data type is DT_STR. Since we want to look at the bytes individually and not the entire string, change the data type to DT_BYTES. Next, and most important, set UseBinaryFormat to TRUE. This will write the HEX VALUES of the output string instead of writing the character values.  Now we are getting somewhere!

    Next, you will need to use a Data Conversion shape in your Data Flow to transform the 2 position byte stream to a 4 position Unicode string containing the packed data.  You need the string to be 4 positions long because it will contain the 4 packed digits.  Here is what that should look like in the Data Conversion shape:

    Direct the output of your data flow to a test table or file to see the results.  In my case, I created a test table.  The results looked like this:

    Hold on a second!  That doesn't look like a date at all.  No, of course not.  It is a hex number which represents the days which have passed between 01/01/1950 and the date.  We have to convert the hex value to a decimal value, and use the DATEADD function to get a date value.
Luckily, I have created a function to convert Hex to Decimal:   -- ============================================= -- Author:        Jim Giercyk -- Create date: March, 2012 -- Description:    Converts a Hex string to a decimal value -- ============================================= CREATE FUNCTION [dbo].[ftn_HexToDec] (     @hexValue NVARCHAR(6) ) RETURNS DECIMAL AS BEGIN     -- Declare the return variable here DECLARE @decValue DECIMAL IF @hexValue LIKE '0x%' SET @hexValue = SUBSTRING(@hexValue,3,4) DECLARE @decTab TABLE ( decPos1 VARCHAR(2), decPos2 VARCHAR(2), decPos3 VARCHAR(2), decPos4 VARCHAR(2) ) DECLARE @pos1 VARCHAR(1) = SUBSTRING(@hexValue,1,1) DECLARE @pos2 VARCHAR(1) = SUBSTRING(@hexValue,2,1) DECLARE @pos3 VARCHAR(1) = SUBSTRING(@hexValue,3,1) DECLARE @pos4 VARCHAR(1) = SUBSTRING(@hexValue,4,1) INSERT @decTab VALUES (CASE               WHEN @pos1 = 'A' THEN '10'                 WHEN @pos1 = 'B' THEN '11'               WHEN @pos1 = 'C' THEN '12'               WHEN @pos1 = 'D' THEN '13'               WHEN @pos1 = 'E' THEN '14'               WHEN @pos1 = 'F' THEN '15'               ELSE @pos1              END, CASE               WHEN @pos2 = 'A' THEN '10'                 WHEN @pos2 = 'B' THEN '11'               WHEN @pos2 = 'C' THEN '12'               WHEN @pos2 = 'D' THEN '13'               WHEN @pos2 = 'E' THEN '14'               WHEN @pos2 = 'F' THEN '15'               ELSE @pos2              END, CASE               WHEN @pos3 = 'A' THEN '10'                 WHEN @pos3 = 'B' THEN '11'               WHEN @pos3 = 'C' THEN '12'               WHEN @pos3 = 'D' THEN '13'               WHEN @pos3 = 'E' THEN '14'               WHEN @pos3 = 'F' THEN '15'               ELSE @pos3              END, CASE               WHEN @pos4 = 'A' THEN '10'                 WHEN @pos4 = 'B' THEN '11'               WHEN @pos4 = 'C' THEN '12'               WHEN @pos4 = 'D' THEN '13'               WHEN @pos4 = 'E' THEN '14'               WHEN @pos4 = 'F' THEN '15'               ELSE @pos4              END) SET @decValue = (CONVERT(INT,(SELECT decPos4 FROM @decTab)))         +                 (CONVERT(INT,(SELECT decPos3 FROM @decTab))*16)      +                 (CONVERT(INT,(SELECT decPos2 FROM @decTab))*(16*16)) +                 (CONVERT(INT,(SELECT decPos1 FROM @decTab))*(16*16*16))     RETURN @decValue END GO     Making use of the function, I found the decimal conversion, added that number of days to 01/01/1950 and FINALLY arrived at my “unpacked relative date”.  Here is the query I used to retrieve the formatted date, and the result set which was returned: SELECT [packedDate] AS 'Hex Value',        dbo.ftn_HexToDec([packedDate]) AS 'Decimal Value',        CONVERT(DATE,DATEADD(day,dbo.ftn_HexToDec([packedDate]),'01/01/1950'),101) AS 'Relative String Date'   FROM [dbo].[Output Table]         This technique can be used any time you need to retrieve the hex value of a character string in SSIS.  The date example may be a bit difficult to understand at first, but with SSIS becoming the preferred tool for enterprise level integration for many companies, there is no doubt that developers will encounter these types of requirements with regularity in the future. Please feel free to contact me if you have any questions.
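    Just to make the arithmetic concrete outside of SSIS and T-SQL, the same conversion can be sketched in a few lines of JavaScript: parse the hex string as a base-16 integer, then add that many days to 01/01/1950 (the equivalent of the DATEADD call above). The hex value used here is an invented sample, not one of the values from the screenshots.

    // Minimal sketch of the hex-to-date arithmetic; "5A2F" is a made-up sample value.
    function packedHexToDate(hex) {
        var days = parseInt(hex, 16);            // e.g. "5A2F" -> 23087
        var date = new Date(1950, 0, 1);         // 01/01/1950 (months are zero-based)
        date.setDate(date.getDate() + days);     // same idea as DATEADD(day, days, '01/01/1950')
        return date;
    }

    console.log(packedHexToDate("5A2F").toDateString());   // lands in mid-March 2013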

    Read the article

  • Google Analytics async=true seems wrong in the Google documentation?

    - by leeand00
    In the Google Analytics async example, they state that in order to include more than one tracker, you need to set up your pages for asynchronous tracking, and they do so using the following code:

    <script type="text/javascript">
      _gaq.push(
        ['_setAccount', 'UA-XXXXX-1'],
        ['_trackPageview'],
        ['b._setAccount', 'UA-XXXXX-2'],
        ['b._trackPageview']
      );

      (function() {
        var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
        ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
        var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
      })();
    </script>

    The second tracker is not receiving any results. After looking at my tracking codes to ensure they are correct, I noticed that the ga.async = true statement is written differently in most other examples: it is never set to the value true, but rather to async. Could this be stopping my Analytics data from posting to the second tracker, or might it be something else? Also, what calls should I look for in the Net tab in Firebug to ensure that GA is being called when the page loads?
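    For what it's worth, ga.async = true sets the script element's DOM property, while the async you see in page source elsewhere is the HTML attribute form of the same flag; the two are equivalent, so that difference by itself should not stop the second tracker. The sketch below shows both forms side by side and is an illustration, not Google's official snippet:

    <script type="text/javascript">
      // Two equivalent ways of making the ga.js loader asynchronous.
      var ga = document.createElement('script');
      ga.type = 'text/javascript';

      ga.async = true;                       // DOM property (what the GA async snippet uses)
      // ga.setAttribute('async', 'async');  // attribute form, equivalent to the line above

      ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') +
               '.google-analytics.com/ga.js';
      var s = document.getElementsByTagName('script')[0];
      s.parentNode.insertBefore(ga, s);
    </script>

    As for Firebug, in the Net tab you should see one request for ga.js and then one __utm.gif request per tracker, so two per pageview with the setup above; if the second __utm.gif is missing, the problem is more likely in the b. tracker calls than in the async flag.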

    Read the article

  • Using HTML5 Today part 2 – Fixing Semantic tags with a Shiv

    - by Steve Albers
    Semantic elements and the Shiv! This is the second entry in the series of demos from the “Using HTML5 Today” talk. For the definitive discussion on unknown elements and the HTML5 Shiv check out Mark Pilgrim’s Dive Into HTML5 online book at http://diveintohtml5.info/semantics.html#unknown-elements Semantic tags increase the meaning and maintainability of your markup, help make your page more computer-readable, and can even provide opportunities for libraries that are written to automagically enhance content using standard tags like <nav>, <header>,  or <footer>. Legacy IE issues However, new HTML5 tags get mangled in IE browsers prior to version 9.  To see this in action, consider this bit of HTML code which includes the new <article> and <header> elements: Viewing this page using the IE9 developer tools (F12) we see that the browser correctly models the hierarchy of tags listed above: But if we switch to IE8 Browser Mode in developer tools things go bad: Did you know that a closing tag could close itself?? The browser loses the hierarchy & closes all of the new tags.  The new tags become unusable and the page structure falls apart. Additionally block-level elements lose their block status, appearing as inline.    The Fix (good) The block-level issue can be resolved by using CSS styling.  Below we set the article, header, and footer tags as block tags. article, header, footer {display:block;} You can avoid the unknown element issue by creating a version of the element in JavaScript before the actual HTML5 tag appears on the page: <script> document.createElement("article"); document.createElement("header"); document.createElement("footer"); </script> The Fix (better) Rather than adding your own JS you can take advantage of a standard JS library such as Remy Sharp’s HTML5 Shiv at http://code.google.com/p/html5shiv/.  By default the Modernizr library includes HTML5 Shiv, so you don’t need to include the shiv code separately if you are using Modernizr.
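    If you would rather not pull in the full html5shiv or Modernizr script, the createElement fix above generalizes to a small loop wrapped in an IE conditional comment. The element list below is illustrative rather than exhaustive, and this is a hand-rolled sketch, not the actual html5shiv source:

    <!--[if lt IE 9]>
    <script type="text/javascript">
      // Create one throwaway instance of each new element so that legacy IE (< 9)
      // parses them as normal containers instead of self-closing unknown tags.
      (function (doc) {
        var tags = 'article aside figcaption figure footer header hgroup nav section time'.split(' ');
        for (var i = 0; i < tags.length; i++) {
          doc.createElement(tags[i]);
        }
      })(document);
    </script>
    <![endif]-->

    Remember that this only fixes parsing; you still need the display:block CSS rule shown earlier to handle the block-level issue.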

    Read the article

  • Oracle E-Business Suite Release 12.1 Certified on Solaris 11

    - by John Abraham
    Oracle Solaris 11 was announced last week, and I'm pleased to also announce that Oracle E-Business Suite Release 12.1 is now certified on Oracle Solaris on SPARC (64-bit). This new operating system release represents a culmination of years of hard work by our Solaris engineering group.  It has a number of new and advanced features including simplified deployment and lifecycle management tools, built-in certified virtualization technologies, support on the latest generation SPARC chips, and more. New installations of the E-Business Suite R12 on this platform will require version 12.1.1 or higher and the latest Rapid Install startCD version 12.1.1.13.  For existing 12.1 installations, we have also certified an "in place" OS upgrade or the use of cloning to a target Solaris 11 system. There are also specific requirements to upgrade technology components such as the Oracle Database and Fusion Middleware.  These requirements are noted in the links below. References Oracle E-Business Suite Installation and Upgrade Notes Release 12 (12.1.1) for Oracle Solaris on SPARC (64-bit) (My Oracle Support Document 761568.1) Oracle Database Installation Guide 11g Release 2 (11.2) for Oracle Solaris Interoperability Notes Oracle E-Business Suite Release 12 with Oracle Database 11g Release 2 (11.2.0) (My Oracle Support Document 1058763.1) Cloning Oracle Applications Release 12 with Rapid Clone (My Oracle Support Document 406982.1) Related Articles New Rapid Install StartCD (12.1.1.13) for Oracle E-Business Suite Release 12.1 Now Available Oracle E-Business Suite Release 12.1.3 Now Available

    Read the article

  • To display the field values submitted with AJAX [closed]

    - by work
    Here is the code. I want to post the field values entered in this form to the page ajaxpost.php using Ajax and then do some operations there. What would be the code required in ajaxpost.php?

    <html>
    <head>
    <script type="text/javascript">
    function loadXMLDoc()
    {
      var xmlhttp;
      if (window.XMLHttpRequest)
      {// code for IE7+, Firefox, Chrome, Opera, Safari
        xmlhttp = new XMLHttpRequest();
      }
      else
      {// code for IE6, IE5
        xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
      }
      xmlhttp.onreadystatechange = function()
      {
        if (xmlhttp.readyState == 4 && xmlhttp.status == 200)
        {
          document.getElementById("myDiv").innerHTML = xmlhttp.responseText;
        }
      }
      var zz = document.f1.dd.value;
      //alert(zz);
      var qq = document.f1.cc.value;
      xmlhttp.open("POST", "ajaxpost.php", true);
      xmlhttp.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
      xmlhttp.send("dd=zz&cc=qq");
    }
    </script>
    </head>
    <body>
    <h2>AJAX</h2>
    <form name="f1">
      <input type="text" name="dd">
      <input type="text" name="cc">
      <button type="button" onclick="loadXMLDoc()">Request data</button>
      <div id="myDiv"></div>
    </form>
    </body>
    </html>
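    One thing worth noting before writing ajaxpost.php: as posted, xmlhttp.send("dd=zz&cc=qq") sends the literal text "zz" and "qq", not the values typed into the form. A minimal sketch of the corrected tail of loadXMLDoc() is below, building the request body from the field values and URL-encoding them:

    // Corrected tail of loadXMLDoc(): post the actual field values, URL-encoded.
    var zz = document.f1.dd.value;
    var qq = document.f1.cc.value;
    var body = "dd=" + encodeURIComponent(zz) + "&cc=" + encodeURIComponent(qq);

    xmlhttp.open("POST", "ajaxpost.php", true);
    xmlhttp.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
    xmlhttp.send(body);

    On the PHP side, ajaxpost.php can then read the values from $_POST['dd'] and $_POST['cc'], do its work, and echo whatever should appear inside myDiv.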

    Read the article

  • Create Site Definition in SharePoint2010 Part2

    - by ybbest
    In the last post, I showed you how to create a simple site definition. In this post, I will continue by adding more features and customization to the site definition. Create a top nav bar for the home page: you need to modify the Onet.xml file by adding a NavBar child into the NavBars element (1002 is the magic number for the top nav) and then adding NavBarPage under the File element, as highlighted in the picture below. Next, I will include all the site and web features for all the list templates and other features that are available in a team site. Open the out-of-the-box team site template by going to 14 → Template → SiteTemplates → sts → xml → ONET.XML, and copy the web features and site features from the file we just opened to your site definition's ONET.XML file. Finally, I will include all the document templates: copy the document templates element from the file we just opened to your own ONET.XML. Redeploy your solution and you will see all the document templates and list templates in your custom site. (Remember to delete all the sites using the previous version of the custom definition before deploying and recreating the site.) You can download the complete solution here.

    Read the article

  • Save .mov file with applescript

    - by Frost Shadow
    I've installed the Perian addon for Quicktime so it can open .flv files, and then I can save them as .m4v or .mov. I'm trying to make an Applescript to convert from .flv to .m4v automatically by using this tutorial and butchering their example applescript file, which normally converts ChemDraw files (.cdx, .cml, .mol) to .tiff, so that it instead uses Quicktime to save the .flv files as .m4v. When I try to use it, though, I get an error "QuickTime Player got an error: document 1 doesn't understand the save message". My save message is currently: save first document in target_path as ".m4v" which looks like the QuickTime dictionary's instructions: save specifier : The document(s) or window(s) to save. [as saveable file format] : The file format to use. I've also tried "m4v", without the period, and still get the error. Is my Save direction wrong, or is it probably an error from trying to use Quicktime instead of the original ChemDraw? I tried to change references to .cdx, .cml, .mol, .tiff, and ChemDraw to .flv, .m4v, and QuickTime respectively, but maybe it's more complicated than that? I would, in fact, appreciate any example showing how to save an application file (ex: a TextEdit .rtf or .txt), as I can't seem to get any kind of file to save using applescript.

    Read the article

  • Portal and Content - Components, part 3 – Applied Customization Framework (4 of 7)

    - by Stefan Krantz
    Have you ever been challenged with a situation where your work task asks you to implement functionality in WebCenter Portal, you browse through the Resource Catalog (Business Dictionary) and find the functionality you need, but when you get started there are small shortcomings, and you ask yourself:
    - How can I re-use what is out of the box?
    - What code do I need to produce similar functions and include my new requirements?
    - Must I write a new taskflow?

    The answer to the above questions is, many times, simply that you can do a taskflow customization of the out-of-the-box taskflows. In this post I will help you understand how to do such a customization. It is best described as a 4-step process; see the image flow below for illustration.

    Just to clarify a few naming confusions that might occur when going through the above process:
    - Customization Role is a function within JDeveloper that allows you to implement view and flow customizations to existing taskflows.
    - WebCenter Portal – Spaces Taskflow Customization Framework: this technology scope does not only refer to WebCenter Spaces, it also includes WebCenter Portal/Framework.
    - A taskflow customization does not overwrite or replace any code; it just creates an additional tip view of the taskflow in the MDS for the current application (WebCenter Portal or WebCenter Spaces).

    To sum up this simple procedure, I would also like to help you find your way around the main topic for this post series. The series focuses primarily on Content integration with WebCenter Portal, so where can you find content-related taskflows in the WebCenter libraries? The list below mentions some useful locations for taskflows and each taskflow's page fragments.

    Library Reference - WebCenter Document Library Service View
    Content Presenter
    Path: oracle.webcenter.doclib.view.jsf.taskflows.presenter
    Taskflow: contentPresenter.xml - The Content Presenter taskflow
    Taskflow: contentPresenterWizard.xml - The publishing wizard to select content, select template and preview, including contribution
    Document Manager
    Path: oracle.webcenter.doclib.view.jsf.taskflows.docManager
    Taskflow: documentManager.xml - The Document Manager taskflow, which includes references to document management features including browsing, download, uploading and viewing

    For more information on taskflow customizations please see the following documentation: http://docs.oracle.com/cd/E23943_01/webcenter.1111/e10148/jpsdg_taskflows.htm#BACIEGJD

    Read the article

  • Can I password protect a Publisher file?

    - by tombull89
    I was asked earlier this week if it was possible to password protect a Microsoft Office 2007 Publisher document. I was under the impression that it would be like protecting a Word document, by going to Office → Save As → Word Document → Tools → General Options and creating a password to modify, as shown below. This also works for Excel documents. However, in Publisher 2007 the option is not there; the only option under "Tools" is "Map network drive". We worked around the issue by saving as a PDF and distributing that, but is there a way to do what we want?

    Read the article

  • Disable MathML output of eLyXer

    - by Gryllida
    eLyXer is a standalone LyX to HTML converter. In the resulting file, equations are formatted as MathML, and the file itself starts with an XML tag. This causes two problems: (1) LibreOffice does not read the XML file (it can read HTML files, but not XHTML); (2) I am unable to copy and paste the equations into a document editor such as LibreOffice with the goal of subsequent conversion into .doc, because .doc files do not support MathML. The eLyXer help page mentions an option to only use simple math, but there is no option to output math equations as images. And I have already set Document → Settings → Output → Math equations → Format: images in LyX, which presumably is saved in the .lyx document somewhere. A web search did not come up with any solutions.

    Read the article

  • Free Google Docs alternative compatible with Opera

    - by f4k3
    Well, gDocs isn't working OK: too many bugs and it's pretty slow (especially when saving documents). I have tried several alternatives:
    - Zoho - they say it's not compatible with Opera, and it's true - you can't even Ctrl+V text
    - Buzzword - it's really slow, and some functions don't work properly (on all browsers); for example "increase indent" increases a random text indent
    - Etherpad - was taken over by Google and is shut down
    - Peepel - it's a cool thing, almost a free virtual desktop in a browser, but it's buggy - I saved a document, tried to open it and an error occurred; the document was lost
    - OpenGoo - went commercial

    At the moment I'm testing ThinkFree Online - it's a bit slow (Java :P) and some minor things don't work (like dragging a toolbar), but it has cool functionality (almost like OpenOffice, which I use at home) and it actually works with Opera (create, save, edit documents). Maybe I'll try Scribd, but is it an office/share platform? Any others worth trying?

    Read the article

  • Loading XML file containing leading zeros with SSIS preserving the zeros

    - by Compudicted
    Visiting the MSDN SQL Server Integration Services Forum, oftentimes I could see people pop up asking this question: “why am I not able to load an element from an XML file that contains zeros so that the leading/trailing zeros remain intact?”. I started to suspect that such a trivial and often-required operation is perhaps being misunderstood by the developer community. I would also like to add that the whole state of affairs surrounding XML today is probably also going to be increasingly affected by the movement of people who dislike XML in general, and many aspects of it such as XSD and XSLT invoke a negative reaction at best. Nevertheless, XML is in wide use today and its importance as a bridge between diverse systems is ever increasing. Therefore, I decided to write up an example of loading an arbitrary XML file that contains leading zeros in one of its elements into a table in a SQL Server database using SSIS, keeping the leading zeros intact and keeping simplicity in mind.

    To start off, bring up BIDS (running as admin) and add a new Data Flow Task (DFT). This DFT will serve as the container for our XML processing elements (besides, the XML Source is not available anywhere else other than from within the DFT). Double-click your DFT and drag and drop the XML Source component from the Tool Box’s Data Flow Sources. Now, let the fun begin!

    Inspired by the upcoming Christmas, I created a simple XML file with one set of data that contains an imaginary SSN for Rudolph with several leading zeros, like 0000003. This file can be viewed here. Configuring the XML Source is of course quite intuitive: point it to our XML file. Next, the XML Source needs either an embedded schema (XSD) or it can generate one for us. Lacking one, I opted to auto-generate it, and I ended up with an XSD that looked like this:

    <?xml version="1.0"?>
    <xs:schema attributeFormDefault="unqualified" elementFormDefault="qualified" xmlns:xs="http://www.w3.org/2001/XMLSchema">
      <xs:element name="XMasEvent">
        <xs:complexType>
          <xs:sequence>
            <xs:element minOccurs="0" name="CaseInfo">
              <xs:complexType>
                <xs:sequence>
                  <xs:element minOccurs="0" name="ID" type="xs:unsignedByte" />
                  <xs:element minOccurs="0" name="CreatedDate" type="xs:unsignedInt" />
                  <xs:element minOccurs="0" name="LastName" type="xs:string" />
                  <xs:element minOccurs="0" name="FirstName" type="xs:string" />
                  <xs:element minOccurs="0" name="SSN" type="xs:unsignedByte" /> <!-- Becomes string -->
                  <xs:element minOccurs="0" name="DOB" type="xs:unsignedInt" />
                  <xs:element minOccurs="0" name="Event" type="xs:string" />
                  <xs:element minOccurs="0" name="ClosedDate" />
                </xs:sequence>
              </xs:complexType>
            </xs:element>
          </xs:sequence>
        </xs:complexType>
      </xs:element>
    </xs:schema>

    As an aside on the XML file: if your XML file does not contain the outer node (<XMasEvent>), then you may end up in a situation where you see just one field in the output. Now please note that the SSN element’s data type was chosen to be unsignedByte (and this is for a reason). The reason stems from the fact that all the figures in the element are digits. This is good, but it is not exactly what we need, because if we attempt to load the data with this XSD we are going to either get errors on the destination or, most typically, lose the leading zeros. So the next intuitive choice is to change the data type to string.
    Besides, if an SSIS package was already created based on this XSD and the data type change was made afterwards, you should reset the metadata by right-clicking the XML Source and choosing “Advanced Editor”; the refresh button at its bottom left will do the trick. So far so good, we are ready to load our XML file. Well, actually yes and no: in my experience some data conversion is typically required, so depending on your data destination you may need to tweak the targeted data types. Let’s add a Data Conversion Task to our DFT. Your package should look like this:

    To make the story short, I will only cover the SSN field. In my data source the target SQL table has it as nchar(10), and we chose string in our XSD (yes, this is a big difference); under such circumstances SSIS will complain. So we will go and change the data type of SSN to Unicode String (DT_WSTR), "World String" per se. The conversion should look like this:

    A peek at the metadata:

    We are almost there; now all we need is to configure the destination. For simplicity I chose the SQL Server Destination. The mapping is a breeze. F5, and I am able to insert my data into SQL Server! Checking the zeros – they are all intact!

    Read the article

  • Tracking Outgoing Links With Google Analytics Events

    - by the_archer
    I've been trying to track clicks on external links on my website using the event tracking method. I've got my Google Analytics code set up before the body ends, as shown below (note: Blogger had converted the quotes into HTML entities, but it works fine):

    <script type='text/javascript'>
      var _gaq = _gaq || [];
      _gaq.push(['_setAccount', 'UA-XXXXXXX-X']);
      _gaq.push(['_trackPageview']);

      (function() {
        var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
        ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
        var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
      })();
    </script>

    Now I wanted to track a link on the addthis.com follow widget, so there is a link of the type below, to which, following instructions from here, I added the onclick event:

    <a addthis:url='http://feeds.feedburner.com/myfeedburnerlurl'
       onClick="_gaq.push(['_trackEvent', 'Subscription Clicks', 'RSS']);"
       class='addthis_button_rss_follow'/>

    I clicked on it a couple of times and have left it for over a day now, but nothing shows up in Google Analytics events; it just says zero events. Here's a screenshot of the events page on GA. Could anybody help me? Am I doing anything wrong?
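    One common reason such clicks never register is that the browser starts navigating away before the _trackEvent beacon has left, especially on links that immediately load a new page. A widely used workaround is to cancel the default navigation, push the event, and follow the link after a short delay. The sketch below is a generic illustration of that pattern; the element id is a placeholder and not part of the AddThis widget:

    <script type="text/javascript">
      // Fire the GA event first, then navigate after a short delay so the
      // __utm.gif request has a chance to go out before the page unloads.
      document.getElementById('rssFollowLink').onclick = function () {
        var href = this.href;
        _gaq.push(['_trackEvent', 'Subscription Clicks', 'RSS']);
        setTimeout(function () { document.location = href; }, 150);
        return false;   // cancel the immediate navigation
      };
    </script>

    It is also worth checking in Firebug that a __utm.gif request with utmt=event actually fires on the click, and remembering that events can take several hours to show up in the standard reports.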

    Read the article

  • Using Quickly for text-heavy app

    - by Kevin
    I am trying to create a small app that displays documentation. When it is run, the application window will display a main menu with buttons labeled 'Document 1', 'Document 2', etc. If a user clicks on one of those buttons, the text from the corresponding document will be displayed in the window. Very basic. The text documents range in length from 1000 to 5000 words, and they need basic formatting (bold, italic, maybe one or two font choices). My question is this: what is the best way to store and display long blocks of formatted text, using Quickly? There seems to be a few options: (1) I could load the text blocks into long python strings, (2) I could load the text from text files, or (3) I could somehow copy and paste the formatted text into Glade. In the first two options, I'm not sure how I would format the text (add italic and bold, for instance) once it was loaded. I have experience with PHP/MySQL/HTML/CSS/Javascript, but I'm new to Python. Any help would be appreciated.

    Read the article

  • Automatically detect faces in a picture

    - by abel
    At my workplace, passport-sized photographs are scanned together, then cut up into individual pictures and saved with unique file numbers. Currently we use Paint.NET to manually select, cut and save the pictures. I have seen that Sony's Cybershot cameras have face detection. Google also gives me something about iPhoto when searching for face detection, and Picasa has face detection too. Are there any ways to auto-detect the faces in a document? That would improve productivity at my workplace by reducing the time needed to cut up the individual images. Sample scanned document (a real document has 5 rows of 4 images each = 20 pics): (from: http://www.memorykeeperphoto.com/images/passport_photo.jpg, fair use) For example, in Picasa 3.8, on clicking View People, all the faces are shown and I am asked to name them; can I save these individual pictures automatically, with the names, as different pictures?

    Read the article

  • Google reverse an analytic

    - by Dan
    I am confused about what code must be executed to reverse a google analytic. I have the following code pasted within a test page: <body onLoad=”function()”> <script type="text/javascript"> var _gaq = _gaq || []; _gaq.push(['_setAccount', 'UA-25305776-3']); _gaq.push(['_trackPageview']); _gaq.push(['_addTrans', '11455', // order ID - required '-42.38', // total - required '-2.38', // tax '-15.00' // shipping ]); _gaq.push(['_addItem', '11455', // order ID - necessary to associate item with transaction 'Evan Turner Turningpoint™ Basketball Pants', // product name '25.00', // unit price - required '-1' // quantity - required ]); _gaq.push(['_trackTrans']); (function() { var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true; ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js'; var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s); })(); </script> Is this correct? Thanks!
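    For reference, the documented way to reverse a transaction in classic ga.js e-commerce tracking is what the snippet above is aiming at: resend the transaction with the same order ID, negative total, tax and shipping, and a negative quantity on each item. One thing worth double-checking is the argument order, since _addTrans expects the affiliation as its second field and _addItem expects the SKU as its second field. A sketch of just the refund pushes, placed after the normal tracking snippet, might look like the following (the amounts mirror the question; the SKU is a placeholder):

    <script type="text/javascript">
      // Reversal of a previously recorded transaction (classic ga.js e-commerce).
      _gaq.push(['_addTrans',
        '11455',      // same order ID as the original transaction
        '',           // affiliation (optional, but keeps the fields aligned)
        '-42.38',     // negative total
        '-2.38',      // negative tax
        '-15.00'      // negative shipping
      ]);
      _gaq.push(['_addItem',
        '11455',                                         // order ID
        'TP-PANTS',                                      // SKU (placeholder)
        'Evan Turner Turningpoint™ Basketball Pants',    // product name
        '',                                              // category (optional)
        '25.00',                                         // positive unit price
        '-1'                                             // negative quantity reverses the item revenue
      ]);
      _gaq.push(['_trackTrans']);
    </script>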

    Read the article

  • "System.Data.OracleClient requires Oracle client software version 8.1.7 or greater." Error Message

    - by Jandost Khoso
    Quick resolution: give full permission to AUTHENTICATED USERS on the following folders: a) ORACLE_HOME, b) Program Files\ORACLE.

    Also check your PATH. You might have installed different clients on your system, and your .NET application may be pointing to a home with an inappropriate client. What your .NET application should load is OCI.DLL with a file version of 8.1.7 or greater.

    According to the MSDN document Oracle and ADO.NET: "The .NET Framework Data Provider for Oracle provides access to an Oracle database using the Oracle Call Interface (OCI) as provided by Oracle Client software. The functionality of the data provider is designed to be similar to that of the .NET Framework data providers for SQL Server, OLE DB, and ODBC."

    The MSDN document System Requirements (Oracle) says: "The .NET Framework Data Provider for Oracle requires Microsoft Data Access Components (MDAC) version 2.6 or later. MDAC 2.8 SP1 is recommended. You must also have Oracle 8i Release 3 (8.1.7) Client or later installed."

    Both the .NET Framework Data Provider for Oracle and the Oracle Data Provider for .NET are data providers for accessing an Oracle database. The former ships with the .NET Framework and requires Oracle client version 8.1.7 or above. The latter is provided by Oracle and requires Oracle client version 9.2 or later. The Oracle Data Provider for .NET (ODP.NET) features optimized ADO.NET data access to the Oracle database. ODP.NET allows developers to take advantage of advanced Oracle database functionality, including Real Application Clusters, XML DB, and advanced security. See the document Comparing the Microsoft .NET Framework 1.1 Data Provider for Oracle and the Oracle Data Provider for .NET for more information about the difference.

    Read the article

  • Cutting and pasting in MS Word: hourglass pops and it takes longer than expected

    - by Rax Olgud
    I work with MS Word 2007. Today I created a new document, and for some reason cutting and pasting text (using Ctrl-X and Ctrl-V) takes longer than expected. To clarify, here's the process:
    - I select a single word in the document
    - I press Ctrl-X
    - The hourglass shows up for 1-2 seconds
    - The word is cut

    The same happens for pasting (i.e. 1-2 seconds of hourglass). This document is ~5 pages long, with nothing fancy. I have plenty of available RAM and my CPU usage is around 1-2%, with no peak during the cut/paste. Any thoughts on what can cause this and what I can do about it?

    Read the article

  • IF commands in a batch file

    - by Rossaluss
    I'm writing a small batch file to replace users' themes and charts in Office, and I have the below batch file that works just fine.

    cd c:\documents and settings\%username%\application data\microsoft\templates
    echo Y|rmdir charts /s
    mkdir charts
    echo Y|del "c:\documents and settings\%username%\application data\microsoft\templates\document themes\*.*"
    net use o: \\servername\sms
    copy "o:\ppt themes\charts\*.*" "c:\documents and settings\%username%\application data\microsoft\templates\charts"
    copy "o:\ppt themes\Document Themes\*.*" "c:\documents and settings\%username%\application data\microsoft\templates\document themes"
    c:
    net use o: /delete

    Now what I want is for the above to only run if it hasn't run before, as we'll be pushing this out to all users for around 2 weeks to catch people who aren't in every day. Is there any way to begin the script with something that looks for one of the new themes/charts already pushed down, and if it's present, have it not run? Any help on this would be greatly appreciated as I'm pretty new to these batch files.

    Read the article
