Search Results

Search found 37260 results on 1491 pages for 'command query responsibil'.

Page 669/1491 | < Previous Page | 665 666 667 668 669 670 671 672 673 674 675 676  | Next Page >

  • ODBC and Excel (2 replies)

    Hello, I am using the following connection to query an Excel spreadsheet: AConnectionString = "Driver={Microsoft Excel Driver (*.xls)};DriverId=790;Dbq=" & ofdSelectFile.FileName & ";DefaultDir=c:\;" ASourceConnection = New Odbc.OdbcConnection(AConnectionString) Dim ADataAdapter As New Odbc.OdbcDataAdapter("SELECT * FROM [Sheet1$]", ASourceConnection) ADataAdapter.Fill(MyDataset) This works great, howe...

    Read the article

  • Get more information about the crash

    - by Tito
    When I issue the command last in my terminal I see the following entries marked "crash". How can I find out more information about the crash? Is there a system dump file generated, or a backtrace file that I can see or analyze?
      reboot   system boot  3.2.0-32-generic Mon Nov 12 23:58 - 09:13 (09:14)
      tito     pts/1        192.168.26.5     Mon Nov 12 23:56 - crash (00:01)
      tito     pts/4        192.168.26.5     Mon Nov 12 22:46 - crash (01:12)
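    A few places to dig further - a minimal sketch, assuming a stock Ubuntu install (log locations may vary on other setups):

      # show the reboot/shutdown records from wtmp; "crash" means no clean shutdown was logged
      last -x reboot shutdown
      # scan what the kernel and system logged around the crash window
      grep -iE 'panic|oops|segfault' /var/log/kern.log /var/log/syslog
      # if apport captured a crash report, the files land here
      ls -l /var/crash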

    Read the article

  • MSCC: Scripting - Administrator's toolbox of magic...

    Finally, we made it to have our April meetup - in May. The most obvious explanation is the increased amount of open source and IT activities that either the MSCC, the Linux User Group of Mauritius (LUGM), or the University of Mauritius Student's Computer Club is organising. It's absolutely incredible to see the recent hype of events here on the island. And I'm loving it! Unfortunately, we also had to deal with arranging for a location this time. It was kind of an odyssey, as my requests (and phone calls) hadn't been answered even though I tried several times - well, kind of disappointing, and I have to look into that for future gatherings. In my opinion, it is essential that two parameters of a community meeting are fixed as early as possible: location, and date and time. You can't just change one or both at the very last minute. Well, this time we had to do it due to unforeseen reasons, and I apologise to any MSCC member who couldn't make it to our April meetup. Okay, lesson learned, but now back to the actual meetup report ...
    Shortly after the meeting I placed the following statement as my first impression: "Spontaneous and improvised :) No, seriously, Ish and Dan had well prepared presentations on shell scripting, mainly focused towards Bourne Again Shell (bash), and the pros and cons of scripting versus actually writing something in a decent programming language. I thought that I could cut myself out of the equation but the demand for information about PowerShell was higher than expected..."
    Well, it turned out that the interest in Windows PowerShell was high, as I even got a couple of questions on it via social media networks during the evening. I would also like to mention that the number of attendees went back to what I would call a "standard" level of participation. This time there were 12 craftsmen, but again a good number of first timers.
    Reactions of other attendees
    Here are some impressions and feedback from our participants:
    "Enjoyed the bash and powershell (linux / windows) presentations ..." -- Nadim on event comments
    "He [Daniel] also showed us some syntax loopholes in Bash that could leave someone with bad code." -- Ish on MSCC – Let's talk about Scripting
    Glad to see a couple of first time attendees, especially students from the university itself.
    Some details on the presentations
    MSCC: First time visit at the University of Mauritius - Phase II Engineering Tower, room 2.9
    Gimme some love ... bash and other shells
    Ish gave a great introduction into shell scripting as he spoke about existing shell environments and a little bit about their history. Furthermore, he talked about various built-in commands, the use of coreutils, the ability to daisy-chain multiple commands using pipes, and the importance of the standard I/O streams and their file descriptors in advanced scripting techniques. Combined with a couple of sample statements in the Linux terminal on an Ubuntu 14.04 machine, it was a solid presentation. Have a closer look at his slides - published on his blog on MSCC – Let's talk about Scripting.
    Oddities of scripting
    After the brief introduction into bash it was Daniel's turn to highlight a good number of oddities when working with shell scripts. First of all, it should be clear that scripting is not meant for full-blown software implementations but simply to automate administrative procedures and to simplify routine jobs on a system. One of the cool oddities that he mentioned is that everything (!) in a shell is represented by strings; there are no other types like integer, float, or date-time that you'd like to use in a full-fledged programming language. Let's have a look at his sample: more to come... What's the output? As a conclusion, Daniel suggests that shell scripting should be limited, but not restricted, to automating repetitive command stacks and batch jobs, startup wrappers for applications in order to set up the execution environment, and other not too sophisticated jobs. But as soon as it might involve a little bit more logic, or you might rely on performance, it's better to write an application in Ruby, Python, or Perl (among others of course). This also enables you to test your code properly.
    MSCC: Ish talking about Bourne Again Shell (bash) and shell scripting to automate regular tasks
    MSCC: Daniel gives an overview about the pros and cons of shell scripting versus programming
    MSCC: PowerShell as your scripting solution on Windows operating systems
    The path of the Enlightened is long ... and tough.
    Honestly, even though PowerShell was mentioned without any further details on the meetup's agenda, I didn't expect that there would be demand to give a presentation on Microsoft PowerShell after all. I had already taken this topic out of the announcement, but the audience wanted to have some information. Okay, then let's see what I could do - improvised style. While my machine booted and got hooked up to the projector, I started to talk about the beginnings of PowerShell back in 2006, and its predecessors MS-DOS and the Command Prompt. A throwback in history... always good for young people. As usual, Microsoft didn't get it at that time. Instead of listening to their clients' needs and demands, they ignored the need to administer Windows server farms without any UI tools. PowerShell is actually a result of this; seeing that shell scripting has been a common, reliable and fast tool in an administrator's toolbox for decades, Microsoft had to adapt from their Microsoft Management Console (MMC) to a broader approach. It's not like shell scripting was something new; it is in daily use on alternative operating systems like AIX, HP-UX, Solaris, and last but not least Linux. Most interestingly, Microsoft is very good at renovating existing architectures, and over the years PowerShell not only replaced their own combination of Command Prompt and Scripting Hosts (VBScript and CScript) but really turned into a challenging competitor on the market. The shell is easy to extend with cmdlets, and open to other Microsoft products like SQL Server and SharePoint, as well as third-party software applications. Similar to MMC, PowerShell also offers the ability to administer other machines remotely - only without a graphical user interface, which makes it easier to automate and schedule regular tasks. Following is a sample of a PowerShell script file (extension .ps1):

      $strComputer = "."
      $colItems = get-wmiobject -class Win32_BIOS -namespace root\CIMV2 -comp $strComputer
      foreach ($objItem in $colItems) {
          write-host "BIOS Characteristics: " $objItem.BiosCharacteristics
          write-host "BIOS Version: " $objItem.BIOSVersion
          write-host "Build Number: " $objItem.BuildNumber
          write-host "Caption: " $objItem.Caption
          write-host "Code Set: " $objItem.CodeSet
          write-host "Current Language: " $objItem.CurrentLanguage
          write-host "Description: " $objItem.Description
          write-host "Identification Code: " $objItem.IdentificationCode
          write-host "Installable Languages: " $objItem.InstallableLanguages
          write-host "Installation Date: " $objItem.InstallDate
          write-host "Language Edition: " $objItem.LanguageEdition
          write-host "List Of Languages: " $objItem.ListOfLanguages
          write-host "Manufacturer: " $objItem.Manufacturer
          write-host "Name: " $objItem.Name
          write-host "Other Target Operating System: " $objItem.OtherTargetOS
          write-host "Primary BIOS: " $objItem.PrimaryBIOS
          write-host "Release Date: " $objItem.ReleaseDate
          write-host "Serial Number: " $objItem.SerialNumber
          write-host "SMBIOS BIOS Version: " $objItem.SMBIOSBIOSVersion
          write-host "SMBIOS Major Version: " $objItem.SMBIOSMajorVersion
          write-host "SMBIOS Minor Version: " $objItem.SMBIOSMinorVersion
          write-host "SMBIOS Present: " $objItem.SMBIOSPresent
          write-host "Software Element ID: " $objItem.SoftwareElementID
          write-host "Software Element State: " $objItem.SoftwareElementState
          write-host "Status: " $objItem.Status
          write-host "Target Operating System: " $objItem.TargetOperatingSystem
          write-host "Version: " $objItem.Version
          write-host
      }

    This gives you information about your BIOS and Windows OS. Then change the computer name to another one on your network (NetBIOS based) and run the script again. There are lots of samples and tutorials at the Microsoft Script Center, and I would advise you to pay a visit over there if you are more interested in PowerShell. The Script Center provides the download links, too.
    Upcoming Events
    What are the upcoming events here in Mauritius? So far, we have the following ones (incomplete list as usual) in chronological order:
    Hacking Defence (14. May 2014)
    WebCup Maurice (7. & 8. June 2014)
    Developers Conference (TBA ~ July 2014)
    Linuxfest 2014 (TBA ~ November 2014)
    Hopefully, there will be more announcements during the next couple of weeks and months. If you know about any other event, like a bootcamp, a code challenge or a hackathon here in Mauritius, please drop me a note in the comment section below this article. Thanks!
    My resume of the day
    Spontaneous and improvised :) The new location at the University of Mauritius turned out very well, there is plenty of space, and it could be a good choice for future meetings. Especially having the ability to get more and more students into our IT community sounds like a great opportunity. Later during the day, I got some promising mails from Nadim regarding future sessions at the local branch of the Middlesex University. Well, we will see in the future... But for now this will be on hold until approximately October, when students resume their regular studies. Anyway, it was a good experience at the university, and thanks again to the UoM Student's Computer Club that made the necessary arrangements for the MSCC!
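    Daniel's actual sample is elided above ("more to come..."), but his point that everything in a shell is a string is easy to illustrate with a minimal sketch of my own (not his original code):

      a=1
      b=2
      echo $a+$b     # prints "1+2" - the + is just text, no arithmetic happens
      echo $((a+b))  # prints "3" - arithmetic only happens inside $(( ))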

    Read the article

  • A pseudo-listener for AlwaysOn Availability Groups for SQL Server virtual machines running in Azure

    - by MikeD
    I am involved in a project that is implementing SharePoint 2013 on virtual machines hosted in Azure. The back end data tier consists of two Azure VMs running SQL Server 2012, with the SharePoint databases contained in an AlwaysOn Availability Group. I used this "Tutorial: AlwaysOn Availability Groups in Windows Azure (GUI)" to help me implement this setup. Because Azure DHCP will not assign multiple unique IP addresses to the same VM, having an AG Listener in Azure is not currently supported. I wanted to figure out another mechanism to support a "pseudo listener" of some sort. First, I created a CNAME (alias) record in the DNS zone with a short TTL (time to live) of 5 minutes (I may yet make this even shorter). The record represents a logical name (let's say the alias is SPSQL) of the server to connect to for the databases in the availability group (AG). When Server1 was hosting the primary replica of the AG, I would set the CNAME of SPSQL to be SERVER1. When the AG failed over to Server2, I wanted to set the CNAME to SERVER2. Seemed simple enough. (It's important to point out that the connection strings for my SharePoint services should use the CNAME alias, and not the actual server name. This whole thing falls apart otherwise.) To accomplish this, I created identical SQL Agent Jobs on Server1 and Server2, with two steps:
    Step 1: Determine if this server is hosting the primary replica.
    This is a TSQL step using this script:

      declare @agName sysname = 'AGTest'
      set nocount on
      declare @primaryReplica sysname

      select @primaryReplica = agState.primary_replica
      from sys.dm_hadr_availability_group_states agState
         join sys.availability_groups ag on agstate.group_id = ag.group_id
      where ag.name = @AGname

      if not exists(
         select *
         from sys.dm_hadr_availability_group_states agState
         join sys.availability_groups ag on agstate.group_id = ag.group_id
         where @@Servername = agstate.primary_replica
         and ag.name = @AGname)
      begin
         raiserror ('Primary replica of %s is not hosted on %s, it is hosted on %s',17,1,@Agname, @@Servername, @primaryReplica)
      end

    This script determines if the primary replica value of the AG is the same as the server name, which means that our server is hosting the current primary replica (you should update the value of the @AgName variable to the name of your AG). If this is true, I want the DNS alias to point to this server. If the current server is not hosting the primary replica, then the script raises an error. Also, if the script can't be executed because it cannot connect to the server, that also will generate an error. For the job step settings, I set the On Failure option to "Quit the job reporting success". The next step in the job will set the DNS alias to this server name, and I only want to do that if I know that it is the current primary replica; otherwise I don't want to do anything. I also include the step output in the job history so I can see the error message.
    Step 2: Update the CNAME entry in DNS with this server's name.
    I used a PowerShell script to accomplish this:

      $cname = "SPSQL.contoso.com"
      $query = "Select * from MicrosoftDNS_CNAMEType"
      $dns1 = "dc01.contoso.com"
      $dns2 = "dc02.contoso.com"
      if ((Test-Connection -ComputerName $dns1 -Count 1 -Quiet) -eq $true)
      {
          $dnsServer = $dns1
      }
      elseif ((Test-Connection -ComputerName $dns2 -Count 1 -Quiet) -eq $true)
      {
          $dnsServer = $dns2
      }
      else
      {
          $msg = "Unable to connect to DNS servers: " + $dns1 + ", " + $dns2
          Throw $msg
      }
      $record = Get-WmiObject -Namespace "root\microsoftdns" -Query $query -ComputerName $dnsServer | ? { $_.Ownername -match $cname }
      $thisServer = [System.Net.Dns]::GetHostEntry("LocalHost").HostName + "."
      $currentServer = $record.RecordData
      if ($currentServer -eq $thisServer)
      {
          $cname + " CNAME is up to date: " + $currentServer
      }
      else
      {
          $cname + " CNAME is being updated to " + $thisServer + ". It was " + $currentServer
          $record.RecordData = $thisServer
          $record.put()
      }

    This script does a few things:
    finds a responsive domain controller (Test-Connection does a ping and returns a Boolean value if you specify the -Quiet parameter)
    makes a WMI call to the domain controller to get the current CNAME record value (Get-WmiObject)
    gets the FQDN of this server (GetHostEntry)
    checks if the CNAME record is correct and updates it if necessary
    (You should update the values of the variables $cname, $dns1 and $dns2 for your environment.) Since my domain controllers are also hosted in Azure VMs, either one of them could be down at any point in time, so I need to find a DC that is responsive before attempting the DNS call. The other little thing here is that the CNAME record contains the FQDN of a machine, plus it ends with a period. So the comparison of the CNAME record has to take the trailing period into account. When I tested this step, I was getting ACCESS DENIED responses from PowerShell for the Get-WmiObject cmdlet that does a remote lookup on the DC. This occurred because the SQL Agent service account was not a member of the Domain Admins group, so I decided to create a SQL Credential to store the credentials for a domain administrator account and use it as a PowerShell proxy (rather than give the service account Domain Admins membership). In SQL Management Studio, right click on the Credentials node (under the server's Security node), and choose New Credential... Then, under SQL Agent-->Proxies, right click on the PowerShell node and choose New Proxy... Finally, in the job step properties for the PowerShell step, select the new proxy in the Run As drop down. I created this two-step job on both nodes of the Availability Group, but if you had more than two nodes, just create the same job on all the servers. I set the schedule for the job to execute every minute. When the server that is hosting the primary replica is running the job, the job history shows the CNAME check succeeding; on the secondary server, the job quits at step 1. When a failover occurs, the SQL Agent job on the new primary replica will detect that the CNAME needs to be updated within a minute. Based on the TTL of the CNAME (which I said at the beginning was 5 minutes), the SharePoint servers will get the new alias within five minutes and should be able to reconnect. I may want to shorten the TTL to reduce the time it takes for the client connections to use the new alias. Using a DNS CNAME and a SQL Agent job on all servers hosting AG replicas, I was able to create a pseudo-listener to automatically change the name of the server that was hosting the primary replica, for a scenario where I cannot use a regular AG listener (in this case, because the servers are all hosted in Azure).

    Read the article

  • A Small Utility to Delete Files recursively by Date

    - by Rick Strahl
    It's funny, but for me the following seems to be a recurring theme: every few months or years I end up with a host of files on my server that need pruning selectively and often under program control. Today I realized that the SQL Server logs on my server were really piling up and nearly ran my backup drive out of space. So occasionally I need to check on that server drive and clean out files. Now with a bit of work this can be done with PowerShell or even a complicated DOS batch file, but heck, to me it's always easier to just create a small console application that handles this sort of thing with a full command line parser and a few extra options; plus in the end I end up with code that I can actually modify and add features to, as is invariably the case. No more searching for a script each time :-) So for my typical cleanup needs the requirements are:
    Need to recursively delete files
    Need to be able to specify a filespec (ie. *.bak)
    Be able to specify a cut-off date before which to delete files
    And it'd be nice to have an option to send files to the Recycle Bin, just in case of operator error :-) (and yes that came in handy as I blew away my entire database backup folder by accident - oops!)
    The end result is a small console file deletion utility that I popped up on Github: https://github.com/RickStrahl/DeleteFiles The source code is up there along with the binary file you can just run.
    Creating DeleteFiles
    It's pretty easy to create a simple utility like DeleteFiles of course, so I'm not going to spend any time talking about how it works. You can check it out in the repository or download and compile it. The nice thing about using a full programming language like C# over something like PowerShell or a batch file is that you can make short work of the recursive tree walking that's required to make this work. There's very little code, but there's also a very small, self-contained command line parser in there that might be useful and that can be plugged into any project - I've been using it quite a bit for just about any console application I've been building. If you're like me and don't have the patience or the persistence (that funky syntax requires some 'sticking with it' that I simply can't get over) to get into PowerShell coding, having an executable file that I can just copy around or keep in my Utility directory is the only way I'll ever get to reuse this functionality without going on a wild search each time :-) Anyway, hope some of you might find this useful. © Rick Strahl, West Wind Technologies, 2005-2012. Posted in Windows, CSharp
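    For comparison - this is not Rick's utility, just a sketch of the same pruning done with GNU find on a Unix-like system (the path and cutoff are made up for illustration):

      # preview *.bak files older than 14 days under /backups, recursively
      find /backups -type f -name '*.bak' -mtime +14 -print
      # then actually delete them
      find /backups -type f -name '*.bak' -mtime +14 -delete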

    Read the article

  • Garbled UI with vnc, chicken-of-the-vnc, gridengine, qmon

    - by bmargulies
    Ubuntu 11.04 gridengine version 6.2u5-1ubuntu1 I start a desktop with the vncserver command. My ~/.vnc/xstartup ends with 'gnome-session'. I connect to it using chicken-of-the-VNC on my MacOSX Lion system. I run 'qmon'. Much of qmon works, but several critical tasks show hopelessly garbled grid layouts. I have filed a bug report, but I suspect that there is something I am missing that would render (ahem) them legible.

    Read the article

  • How to remove "online accounts" from "system settings"?

    - by A_txt
    I uninstalled "Unity" and use "Gnome Shell" instead, but the new "Online Accounts" panel is still in the "System Settings" menu. How can I completely remove it? I tried the command below, but it doesn't work:
      sudo apt-get -y remove unity-lens-shopping account-plugin-aim account-plugin-facebook account-plugin-flickr account-plugin-google account-plugin-icons account-plugin-identica account-plugin-jabber account-plugin-salut account-plugin-twitter account-plugin-windows-live account-plugin-yahoo gnome-online-accounts

    Read the article

  • Creating A SharePoint Parent/Child List Relationship – SharePoint 2010 Edition

    - by Mark Rackley
    Hey blog readers… It has been almost 2 years since I posted my most read blog on creating a Parent/Child list relationship in SharePoint 2007: Creating a SharePoint List Parent / Child Relationship - Out of the Box. And then a year ago I improved on my method and redid the blog post… still for SharePoint 2007: Creating a SharePoint List Parent/Child Relationship – VIDEO REMIX. Since then many of you have been asking me how to get this to work in SharePoint 2010, and frankly I have just not had time to look into it. I wish I could have jumped into this sooner, but have just recently begun to look at it. Well.. after all this time I have actually come up with two solutions that work; neither of them is as clean as I'd like it to be, but I wanted to get something in your hands that you can start using today. Hopefully in the coming weeks and months I'll be able to improve upon this further and give you guys some better options. For the most part, the process is identical to the 2007 process, but you have probably found out that the list view web parts in 2010 behave differently, and getting the Parent ID to your new child form can be a pain in the rear (at least that's what I've discovered). Anyway, like I said, I have found a couple of solutions that work. If you know of a better one, please let us know, as it bugs me that this is not as elegant as my 2007 implementation.
    Getting on the same page
    First thing I'd recommend is recreating this blog post: Creating a SharePoint List Parent/Child Relationship – VIDEO REMIX in SharePoint 2010… There are some slight differences, but it's basically the same… Here's a quick video of me doing this in SP 2010: Creating Lists necessary for this blog post.
    Now that you have the lists created, let's set up the New Time form to use a QueryString variable to populate the Parent ID field: Creating parameters in Child's new item form to set parent ID.
    Did I talk fast enough through both of those videos? Hopefully by now that stuff is old hat to you, but I wanted to make sure everyone could get on the same page. Okay… let's get started.
    Solution 1 – XSLTListView with JavaScript
    This solution is the more elegant of the two; however, it does require the use of a little JavaScript. The other solution does not use JavaScript, but it also doesn't use the pretty new SP 2010 pop-ups. I'll let you decide which you like better. The basic steps of this solution are:
    Insert a Related Item View
    Insert a ContentEditorWebPart
    Insert script in the ContentEditorWebPart that pulls the ID from the query string and calls the method to insert a new item on the child entry form
    Hide the toolbar of the data view to remove the "add new item" link
    Again, you don't HAVE to use a CEWP, you could just put the JavaScript directly in the page using SPD.
    Anyway, here is how I did it: Using Related Item View / JavaScript. Here's the JavaScript I used in my Content Editor Web Part:

      <script type="text/javascript">
      function NewTime() {
          // Get the Query String values and split them out into the vals array
          var vals = new Object();
          var qs = location.search.substring(1, location.search.length);
          var args = qs.split("&");
          for (var i = 0; i < args.length; i++) {
              var nameVal = args[i].split("=");
              var temp = unescape(nameVal[1]).split('+');
              nameVal[1] = temp.join(' ');
              vals[nameVal[0]] = nameVal[1];
          }
          var issueID = vals["ID"];
          // use this to bring up the pretty pop up
          NewItem2(event, "http://sp2010dev:1234/Lists/Time/NewForm.aspx?IssueID=" + issueID);
          // use this to open a new window
          // window.location = "http://sp2010dev:1234/Lists/Time/NewForm.aspx?IssueID=" + issueID;
      }
      </script>

    Solution 2 – DataFormWebPart and the exact same 2007 process
    This solution is a little more of a hack, but it is also MUCH closer to the process we used in SP 2007. So, if you don't mind not having the pretty pop-up and prefer the comforts of what you are used to, you can give this one a try. The basic steps are:
    Insert a DataFormWebPart instead of the List Data View
    Create a Parameter on the DataFormWebPart to store the "ID" query string variable
    Filter the DataFormWebPart using the Parameter
    Insert a link at the bottom of the DataFormWebPart that points to the child's new item form and passes in the Parent ID using the Parameter
    See.. like I told you, the exact same process as in 2007 (except using the DataFormWebPart). The DataFormWebPart also requires a lot more work to make it look "pretty", but it's just table rows and cells, and can be configured pretty painlessly. Here is that video: Using DataForm Web Part.
    One quick update… if you change the link in this solution from:

      <tr>
        <td><a href="http://sp2010dev:1234/Lists/Time/NewForm.aspx?IssueID={$IssueIDParam}">Click here to create new item...</a></td>
      </tr>

    to:

      <tr>
        <td><a href="javascript:NewItem2(event,'http://sp2010dev:1234/Lists/Time/NewForm.aspx?IssueID={$IssueIDParam}');">Click here to create new item...</a></td>
      </tr>

    it will open up in the pretty pop up and act the same as solution one… So… both solutions will now behave the same to the end user. It just depends on which you want to implement. That's all for now… Remember, in both solutions, once you have them working you can make the "IssueID" invisible to users by using the "ms-hidden" class (see my previous blog post on the subject up there). That's basically all there is to it! No pithy or witty closing this time… I am sorry it took me so long to dive into this and I hope your questions are answered. As I become more polished myself I will try to come up with a cleaner solution that will make everyone happy… As always, thanks for taking the time to stop by.

    Read the article

  • Sending Parameters with the BizTalk HTTP Adapter

    - by Christopher House
    I've never had occasion to use the BizTalk HTTP adapter, since I've always needed SOAP rather than just POX (plain old XML). Yesterday we decided that we're going to expose some data via a Java servlet that will accept an HTTP post and respond with POX. I knew BizTalk had an HTTP adapter, but I had no idea what its capabilities were. After a quick read through the BizTalk docs, it was apparent that the HTTP send adapter does in fact do posts. The concern I had, though, was how we were going to supply parameters to the servlet. The examples I had seen using the HTTP adapter all involved posting an XML message to some HTTP location. Our Java guy, however, didn't want to take that approach. He wanted us to provide a query string via post, much like you'd expect to see on an HTTP get. I decided to put together a little test scenario and see what I could come up with. We didn't have a test servlet I could go against and my Java experience is virtually nil, so I decided to put together an ASP.Net project to act as the servlet. It didn't need to be fancy, just one HttpHandler that accepts a post, reads a parameter and returns XML. With the HttpHandler done, I put together a simple orchestration to send a message to the handler. I started by having the orch send a message of type System.String to see what it would look like when the handler received it. I set a breakpoint in my handler and kicked off the orchestration. Below is what I saw: As I suspected, because of BizTalk's XML serialization, System.String was not going to work. I thought back to my BizTalk 2004 days and a project I worked on that required sending HTML formatted emails via the SMTP adapter. To accomplish that, I had used a .Net class with a custom serialization formatter that I got from a Microsoft sample. The code for the class, RawString, can be found here. I created a new class library with the RawString class as well as a static factory class, referenced that in my orchestration project and changed my message type from System.String to RawString. Below is what the code in my message construction looks like: After deploying the updated orchestration, I fired it off again and checked the breakpoint in my HttpHandler. This is what I saw: And there you have it. The RawString message type allowed me to pass a query string in the HTTP post without wrapping it in XML.

    Read the article

  • Portal And Content - Content Integration - Best Practices

    - by Stefan Krantz
    Lately we have seen an increase in projects that have failed to achieve either user-friendly content integration or satisfactory performance. Our intention is to mitigate any knowledge gap that our previous post might have left you with, so this post will repeat some recommendations or reference back to older useful posts. Moreover, this post will help you understand from the ground up how to design, architect and implement business-enabled, responsive and performing portals with complex requirements on business-centric information publishing.
    Design the Information Model
    The key to successful portal deployments is information modeling. It's a key task to understand the use case you are designing for, so I have designed a set of questions you need to ask yourself or your customer:
    Question: Who will own the content, IT or Business? Answer: Business
    Question: Who will publish the content, IT or Business? Answer: Business
    Question: Will there be multiple publishers? Answer: Yes
    Question: Are the publishers computer scientists? Answer: No
    Question: How often does the information change - daily, weekly, monthly? Answer: Daily, weekly
    If your answers match at least two of these, we strongly recommend you design your content with the following principles:
    Divide your pages into logical sections, where each section is marked with its purpose
    Assign capabilities to each section: does it contain text, images, formatting, and/or is it static and populated through other contextual information
    Select the editor/design element type:
    WYSIWYG - rich text
    Plain Text - non-formatted text
    Image - image object
    Static List - static list of formatted information
    Dynamic Data List - assembled information from multiple data files through a CMIS query
    The result of such a design map could look like the examples below. Based on the outcome of the required elements in the design (column 3 from the left), you will now simply design a data model in WebCenter Content - Site Studio by creating a Region Definition structure matching your design requirements. For more information on how to create a Region Definition, see the following post: Region Definition Post - note, see instruction 7 for details. Each Region Definition can now be used to instantiate data files; a data file holds the actual data for each element in the Region Definition. Another way to see this is to compare the Region Definition to an extension of the metadata model in WebCenter Content for each data file item.
    Design content templates
    With a solid, dependable information model we can now proceed to template creation and page design. This phase focuses on how to place the content sections from the Region Definition on the page via a Content Presenter template. Remember, by creating Content Presenter templates you leverage the latest and most integrated technology WebCenter has to offer. This phase is much easier since you already have the information model and the design wire-frames to base the logic on; however, there are still a few considerations to pay attention to:
    Base the template on ADF and make only necessary exceptions to markup when required
    Leverage ADF design components for tabs, accordions and other similar components; this way the design in the content-published areas will comply with other design areas based on custom ADF taskflows
    There is no performance impact when using metadata- or Region Definition-based data
    All data, regardless of type (metadata or XML data), can be accessed via the Content Presenter node
    See below for applied examples of how to access data:
    Access a metadata property of a document: #{node.propertyMap['myProp'].value} - myProp in this example can be, for instance, dDocName, dDocTitle, xComments or any other available metadata
    Access element data from a data file's XML: #{node.propertyMap['[Region Definition Name]:[Element name]'].asTextHtml} - Region Definition Name is the region definition that the current data file instantiates; Element name is the element value you would like to grab from the data file
    I recommend you read the following useful posts on the content template topic: CMIS queries and template creation (note, see instruction 9 for details) and Static List template rendering. For more information on templates: Single Item Content Template, Multi Item Content Template, Expression Language.
    Internationalization Considerations
    When integrating content assets via Content Presenter, you probably understand by now that the content item/data file is wired to the page. What is also pretty common at this stage is that the content item/data file only supports one language, since it is neither practical nor business-friendly to mix locales in one complex structure. You are therefore left with a very common dilemma: building a complete new portal for each locale, which is not a good option! However, with a little bit of information modeling and a clear naming convention, this can be addressed. Basically, you make sure that all content items/data files are named with a predictable naming convention like "Content1_EN" for the English rendition and "Content1_ES" for the Spanish rendition. This way, through simple customizations, you will be able to dynamically switch the actual content item/data file just before rendering. By following the proposed approach you not only enable a simple mechanism for internationalized content, you also preserve the functionality in the Content Presenter to support business-accessible run-time publishing of information on existing and new pages. I recommend you read the following useful post on internationalization topics: Internationalize with Content Presenter.
    Integrate with Review & Approval processes
    Today, the review and approval functionality and configuration is based on WebCenter Content - Criteria Workflows. Criteria Workflows use the metadata of the checked-in document to evaluate whether the document is under any review/approval process. So, for instance, if a Criteria Workflow is configured to trigger on any document with Version "2" or higher and Content Type "Instructions", any matching content item version will, on check-in, enter the workflow before being released for general access. A few things to consider when configuring Criteria Workflows:
    Make sure not to trigger on version 1 for content items that are data files - if you trigger on version 1, you will not only approve an empty document, you will also have a Content Presenter pointing to a non-existing document, since the document only becomes available after successful completion of the workflow
    Approval workflows sometimes require more complex criteria; the recommendation in that case is that the metadata triggering such criteria is automatically populated, which can be achieved through many approaches, including Content Profiles
    Criteria Workflows are configured and managed in the WebCenter Content Administration Applets, where you can configure one or more workflows
    When you have configured Criteria Workflows, the Content Presenter supports the editors with the approval process directly inline, in the "Contribution mode" of the portal. In addition to approve/reject and details of the task, the Content Presenter natively supports viewing the current and future versions of the change being approved. See below for an example.
    Architectural recommendations
    To support review & approval processes, minimize the number of data files per page
    Each CMIS query can consume significant time depending on its complexity - minimize the number of CMIS queries per page
    Use Content Presenter templates based on ADF - this way you minimize the design considerations and optimize the usage of caching
    Implement the page with as few data files as possible - this simplifies the publishing process, increases performance and simplifies the release process
    A named data file (node), or a list of named nodes, integrated on a page performs better than querying for data
    A named data file (node), or a list of named nodes, integrated on a page enables business-centric page creation and publishing and reduces the need for IT department interaction
    Summary
    Just because one architectural decision solves a business problem doesn't mean it's the right one; when designing portals, all architecture has to be in harmony, with no part impacting another. For instance, the most technically complex solution is not always the best, since it will most likely defeat business accessibility, performance or both. The best approach is therefore to first design for simplicity, such that even a non-technical user can operate it; after that, consider the performance impact; and finally, look at the remaining technology challenges and work around them first with out-of-the-box features, then design and develop functions to complement the shortcomings.

    Read the article

  • Fail to install eclipse-cdt on ubuntu 11.10 for ARM panda board

    - by Jiangning
    I failed to install eclipse-cdt on Ubuntu 11.10 for the ARM Panda board with the command line below:
      sudo apt-get install eclipse-cdt
    Tracing the problem, I find the root cause is:
      eclipse-rcp : Depends: libequinox-osgi-java (= 3.5.2-11ubuntu3) but 3.7.0-0ubuntu1 is to be installed
    Actually, I can't find this version of the libequinox-osgi-java package at all in apt-get for ARM. So how do I get it installed? Thanks, -Jiangning
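    Not a fix, but a quick way to see which versions of the dependency apt actually knows about on the ARM image - standard apt-cache commands, nothing board-specific assumed:

      # list candidate versions and the repositories they come from
      apt-cache policy libequinox-osgi-java eclipse-rcp
      # dump the full version/dependency details apt has for the package
      apt-cache showpkg libequinox-osgi-java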

    Read the article

  • Silverlight Cream for March 06, 2010 -- #808

    - by Dave Campbell
    In this Issue: András Velvárt, felix corke, Colin Eberhardt, Christopher Bennage, Gergely Orosz, Entity Spaces Team Blog, Mike Taulty(-2-), Jit Ghosh, and Jesse Liberty.
    Shoutouts: Jeremy Likness expands on the Silverlight Team's post Vancouver Olympics - How'd We Do That? Gavin Wignall has a post up Creating a 360 photograph of an object with Silverlight Photosynth.
    From SilverlightCream.com:
    Transforming an Ugly Duckling into a Graceful Swan With Expression Blend and Silverlight - Part 2 Intro Animation
    András Velvárt has part 2 of his Transformation series up at SilverlightShow... he's taking the intro animation to a new length, allowing playback even... cool video tutorial!
    Free Silverlight 4 beta skin!
    felix corke has a Silverlight 4 theme up for us all to use. If you like a dark theme like Blend, you'll like this... I like it!
    Linq to Visual Tree
    Colin Eberhardt has a great tutorial up for using LINQ to query the WPF or Silverlight Visual Tree while retaining the tree structure. He also has links out to other techniques.
    XAML Attributes on Separate Lines
    Christopher Bennage has a post up showing how to easily get all your XAML attributes on separate lines using a VS menu option... I didn't know that!
    Using built-in, embedded and streamed fonts in Silverlight
    Gergely Orosz has a post up at ScottLogic going over fonts in Silverlight - built-in, embedded, or streamed - with examples and code.
    EntitySpaces 2010 Two Part Series on Silverlight and WCF
    The Entity Spaces Team Blog has a pair of videos up on Entity Spaces 2010, WCF, and Silverlight. Part 1 is the intro and explanation, part 2 is a full-up app demonstrating it.
    MEF, Silverlight and the DeploymentCatalog
    In an attempt to respond fully to a query, Mike Taulty literally pushed the record button and took off on what became a tutorial video on building a real Silverlight app utilizing MEF.
    Silverlight 4, Experiment with Pluggable Navigation and a WCF Data Service
    Mike Taulty has an experiment detailed on his blog about pluggable navigation and Silverlight 4. He walks through the history of how we got to this point, then takes on an example... good external links too.
    Enhancing Silverlight Video Experiences with Contextual Data
    This is a post on the MSDN Magazine site where Jit Ghosh goes in depth on not only Smooth Streaming with Silverlight, but also adding contextual data to your video.
    When Is It OK To Hack?
    Read what all Jesse Liberty gets involved in when he's trying to get something out the door and has to work around a problem. Just about as interesting are the comments... check it out and leave your own!
    Stay in the 'Light!
    Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream
    Join me @ SilverlightCream | Phoenix Silverlight User Group
    Technorati Tags: Silverlight, Silverlight 3, Silverlight 4, MIX10

    Read the article

  • SQL Windowing screencast session for Cuppa Corner - rolling totals, data cleansing

    - by tonyrogerson
    In this 10 minute screencast I go through the basics of what I term windowing, which is basically the technique of filtering to a set of rows given a specific value - for instance a sub-query that aggregates, or a join that returns more than just one row (for instance on a one-to-many relationship). http://sqlserverfaq.com/content/SQL-Basic-Windowing-using-Joins.aspx SQL below...

      USE tempdb
      go
      CREATE TABLE RollingTotals_Nesting (
          client_id int not null,
          transaction_date date not null,
          transaction_amount...(read more)

    Read the article

  • Website Health Check - Keyword Blunders - Part 2

    A website health check is becoming an integral part of Search Engine Optimization (SEO), because it helps to find mistakes in websites that commonly go unnoticed. Ultimately, it can mean the difference between coming up as the first or the last result in a search engine query. A small part regarding keywords - keyword density, keyword stuffing, spelling errors, cloaking, etc. - is explained here.

    Read the article

  • Compiling SDL under Windows with sdl-config

    - by DarrenVortex
    I have downloaded NXEngine (the open source version of Cave Story). I have a makefile in the directory, which I execute using msys. However, the makefile uses sdl-config:

      g++ -g -O2 -c main.cpp -D DEBUG `sdl-config --cflags` -Wreturn-type -Wformat -Wno-multichar -o main.o
      /bin/sh: sdl-config: command not found

    And apparently sdl-config does not exist under Windows, since there's no sdl installation. There's also no documentation on the official sourceforge website about this! What do I do?
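    sdl-config is only a small script that prints compiler and linker flags, so a common workaround is to pass the equivalent flags by hand. A sketch, assuming the SDL 1.2 development files were unpacked to /c/SDL (adjust to wherever SDL actually lives on your machine):

      # roughly what `sdl-config --cflags` would expand to on MinGW
      g++ -g -O2 -c main.cpp -D DEBUG -I/c/SDL/include/SDL -Dmain=SDL_main -Wreturn-type -Wformat -Wno-multichar -o main.o
      # and roughly what `sdl-config --libs` covers at link time
      g++ *.o -L/c/SDL/lib -lmingw32 -lSDLmain -lSDL -mwindows -o nx.exe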

    Read the article

  • How to use Secure Erase and is it on the install CD?

    - by Mikey
    Supposedly there is some built in hard drive magic called "Secure Erase" which is wildly faster and more secure than "dd if=/dev/zero..." I am most excited about the speed increase. There seems to be a GUI for it as part of Parted Magic: http://www.ocztechnologyforum.com/forum/showthread.php?81321-Secure-Erase-With-bootable-CD-USB-Linux..-Point-and-Click-Method Is there something like this for Ubuntu? Better yet, is there a way to actually issue this command "manually" like with smartctl or something?
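    For reference, the ATA Secure Erase sequence can be issued manually on Ubuntu with hdparm (not smartctl) - a sketch only; replace /dev/sdX with the correct device and note this irreversibly wipes the whole drive:

      # 1) confirm the drive supports Security Erase and is "not frozen"
      sudo hdparm -I /dev/sdX
      # 2) set a temporary user password (required before erasing)
      sudo hdparm --user-master u --security-set-pass Eins /dev/sdX
      # 3) issue the erase (or --security-erase-enhanced where supported)
      sudo hdparm --user-master u --security-erase Eins /dev/sdX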

    Read the article

  • TSQL Challenge 28 - SELECT TOP N articles from each category from a SQL Server 2000 database

    The challenge is to write a query that returns the articles to be displayed on the home page of the website. N articles from each category are to be selected, where N is configured in the Categories table. Each category should show its most recent N articles. ArticleID can be used to identify the most recent articles: a higher ArticleID indicates a more recent article.

    Read the article

  • Some date math fun

    - by drsql
    Note: The concept presented here is pretty simple and I am not claiming that I am the first to do this… so if you were the one who came up with it, let me know and I will give you linkage. Today I needed to get the data for the current month for a query, and it hit me that I really didn't have a good way to do this. There were two common methods that people use:

      WHERE YEAR(MonthColumn) = YEAR(GETDATE())
        AND MONTH(MonthColumn) = MONTH(GETDATE())

    But this particular method is pretty horrible...(read more)

    Read the article

  • Discoverer 11.1.1.4 Certified with E-Business Suite

    - by Steven Chan
    Oracle Business Intelligence Discoverer is an ad-hoc query, reporting, analysis, and web-publishing tool that allows end-users to work directly with Oracle E-Business Suite OLTP data. Discoverer 11g (11.1.1.4) is now certified with Oracle E-Business Suite. Discoverer 11.1.1.4 is part of Oracle Fusion Middleware 11g Release 1 Version 11.1.1.4.0, also known as FMW 11g Patchset 3. Certified E-Business Suite releases are:
    EBS Release 11i 11.5.10.2 + ATG RUP 7 and higher
    EBS Release 12.0.6 and higher
    EBS Release 12.1.1 and higher

    Read the article

  • NoSQL is not about object databases

    - by Bertrand Le Roy
    NoSQL as a movement is an interesting beast. I kinda like that it's negatively defined (I happen to belong myself to at least one other such a-community). It's not, at its roots, about proposing one specific new silver bullet to kill an old problem. It's about challenging the consensus. Actually, blindly and systematically replacing relational databases with object databases would just replace one set of issues with another. No, the point is to recognize that relational databases are not a universal answer - although they have been used as one for so long - and to recognize instead that there's a whole spectrum of data storage solutions out there. Why is it so hard to recognize, by the way? You are already using some of those other data storage solutions every day. Let me cite a few:
    The file system
    Active Directory
    XML / JSON documents
    The Web
    e-mail
    Logs
    Excel files
    EXIF blobs in your photos
    Relational databases
    And yes, object databases
    It's just a fact of modern life. Notice by the way that most of the data that you use every day is unstructured and thus mostly unsuitable for relational storage. It really is more a matter of recognizing it: you are already doing NoSQL. So what happens when for any reason you need to simultaneously query two or more of these heterogeneous data stores? Well, you build an index of sorts combining them, and that's what you query instead. Of course, there's not much distance to travel from that to realizing that querying is better done when completely separated from storage. So why am I writing about this today? Well, that's something I've been giving lots of thought, on and off, over the last ten years. When I built my first CMS all that time ago, one of the main problems my customers were facing was to manage and make sense of the mountain of unstructured data that constituted most of their business. The central entity of that system was the file system, because we were dealing with lots of Word documents, PDFs, OCR'd articles, photos and static web pages. We could have stored all that in SQL Server. It would have worked. Ew. I'm so glad we didn't. Today, I'm working on Orchard (another CMS ;). It's a pretty young project, but already one of the questions we get the most is how to integrate existing data. One of the ideas I'll be trying hard to sell to the rest of the team in the next few months is to completely split the querying from the storage. Not only does this provide great opportunities for performance optimizations, it gives you homogeneous access to heterogeneous and existing data sources. For free.

    Read the article

  • Gain Quick Access to the Cache in Firefox

    - by Asian Angel
    Are you looking for a quick and simple way to view the contents of the cache in Firefox? Then you will definitely want to see how easy it can be using the CacheViewer extension.
    Note: CacheViewer is a front-end app for easily accessing and searching the memory cache.
    Before
    Viewing the cache in Firefox using "about:cache" provides some information about the contents but may not be the most efficient method available for some people.
    CacheViewer in Action
    Once you have installed the extension there are three easy ways to access your new cache viewer. The first is using the "CacheViewer Command" available in the "Tools Menu" and the second is using the keyboard shortcut "Ctrl + Shift + C". The third way is by adding a "Toolbar Button" to your browser's UI. All three work equally well…choose the method that best suits your personal needs.
    When you access the "CacheViewer Window" this is what it will look like. You may decide to resize it and move (or hide) some of the columns for the best viewing. You can easily scroll through the cache contents and preview images if desired as shown here. If you keep the "CacheViewer Window" open you can refresh it as you browse using the "Refresh Button" in the lower right corner. This is a nice, quick, and very simple way to access the cache on demand and save items to your hard-drive if desired.
    Note: The "CacheViewer" can also be set to open in a new tab instead (see "Options").
    Options
    Choose whether "CacheViewer" opens in a separate window (default) or in a new tab.
    Conclusion
    If you want a quick and simple way to view the cache in Firefox then the CacheViewer extension is just what you have been looking for.
    Link
    Download the CacheViewer extension (Mozilla Add-ons)

    Read the article

  • Create Custom Sized Thumbnail Images with Simple Image Resizer [Cross-Platform]

    - by Asian Angel
    Are you looking for an easy way to create custom sized thumbnail images for use in blog posts, photo albums, and more? Whether it is a single image or a CD full, Simple Image Resizer is the right app to get the job done for you.
    To add the new PPA for Simple Image Resizer open the Ubuntu Software Center, go to the Edit Menu, and select Software Sources. Access the Other Software Tab in the Software Sources Window and add the first of the PPAs shown below (outlined in red). The second PPA will be automatically added to your system.
    Once you have the new PPAs set up, go back to the Ubuntu Software Center and click on the PPA listing for Rafael Sachetto on the left (highlighted with red in the image). The listing for Simple Image Resizer will be right at the top…click Install to add the program to your system.
    After the installation is complete you can find Simple Image Resizer listed as Sir in the Graphics sub-menu.
    When you open Simple Image Resizer you will need to browse for the directory containing the images you want to work with, select a destination folder, choose a target format and prefix, enter the desired pixel size for converted images, and set the quality level. Convert your image(s) when ready…
    Note: You will need to determine the image size that best suits your needs beforehand. For our example we chose to convert a single image. A quick check shows our new "thumbnailed" image looking very nice.
    Simple Image Resizer can convert into and from the following image formats: .jpeg, .png, .bmp, .gif, .xpm, .pgm, .pbm, and .ppm
    Command Line Installation
    Note: For older Ubuntu systems (9.04 and previous) see the link provided below.
      sudo add-apt-repository ppa:rsachetto/ppa
      sudo apt-get update && sudo apt-get install sir
    Links
    Note: Simple Image Resizer is available for Ubuntu, Slackware Linux, and Windows.
    Simple Image Resizer PPA at Launchpad
    Simple Image Resizer Homepage
    Command Line Installation for Older Ubuntu Systems
    Bonus
    The anime wallpaper shown in the screenshots above can be found here: The end where it begins [DesktopNexus]

    Read the article

  • Run simple bash script to start applications at login

    - by ganjan
    I want to run a simple bash script automatically when I log in. For example:

      #!/bin/bash
      echo "start spotify"
      gnome-terminal -e spotify --title spotify

    When I run this command, one gnome-terminal shows up and spotify shows up. I also want the gnome-terminal to pop up "hidden" on a different virtual desktop (one of the other four virtual desktops you can choose from the taskbar). I tried to add this to /home/me/.bash_login or something, but that didn't work..
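    One approach that usually works for graphical logins (a sketch - note that .bash_login only runs for login shells, not for a graphical GNOME session) is a desktop autostart entry. Create the directory if needed:

      mkdir -p ~/.config/autostart

    then save the following as ~/.config/autostart/start-spotify.desktop (the script path is hypothetical - point it at wherever your script lives, and make the script executable with chmod +x):

      [Desktop Entry]
      Type=Application
      Name=Start Spotify
      Exec=/home/me/bin/start-spotify.sh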

    Read the article
