Search Results

Search found 36521 results on 1461 pages for 'aq advanced queue oracle support streams propagation schedule dblink troubleshoo'.


  • Removing the "Cannot find a skin that matches family portal and version v1.1" message

    - by Maiko Rocha
    Do you get annoyed by the following message in your WebLogic log output? <SkinFactoryImpl> <getSkin> Cannot find a skin that matches family portal and version v1.1. We will use the skin portal.desktop. Yes? Well, me too :-). To get rid of it, just open your portal application's trinidad-config.xml file and remove the <skin-version> element from it.
    Before:
      <?xml version="1.0" encoding="UTF-8"?>
      <trinidad-config xmlns="http://myfaces.apache.org/trinidad/config">
        <skin-family>#{preferenceBean.defaultTrinidadSkin}</skin-family>
        <skin-version>v1.1</skin-version>
      </trinidad-config>
    After:
      <?xml version="1.0" encoding="UTF-8"?>
      <trinidad-config xmlns="http://myfaces.apache.org/trinidad/config">
        <skin-family>#{preferenceBean.defaultTrinidadSkin}</skin-family>
      </trinidad-config>

    Read the article

  • Interviews Gone Bad.....Now What Do I Do?

    - by david.talamelli
    We have all done it at some stage of our working careers - you know those times when you leave an interview and then you think to yourself "why didn't I ask that question" or "I can't believe I said that" or "how could I have forgotten to say that". It happens to everyone but how you handle things moving forwards could be critical in helping you land that dream job. There is nothing better than seeing that dream job with the dream company that you are looking to work for advertised (or in some cases getting called by the Recruiter to let you know about that job). The role may seem perfect and it could be just what you are looking for and it is with the right company as well. You have sent in your resume and have subsequently had one, two or maybe three interviews for the role. After each step of the process you get a little bit more excited about the role as you start to think about your work day in your new role/company. Then it happens, you get it: you get The Phone Call to inform you that you have not been successful in securing the position that you have invested so much time and effort into. It can be disappointing to hear this news but what you do next is important in potentially keeping that door open for future opportunities with that company. How you handle yourself in this situation is important: if any of you remember the Choose Your Own Adventure Books do you: Tell the Recruiter (maybe get aggressive) they are wrong in their assessment and that you are the right candidate for the role Switch off and say ok thanks and hang up without engaging in any further dialogue Thank the company for their time and enquire if there may be any other opportunities in the future to explore If you chose the first option - the company in question may consider whether or not to look at you for other opportunities. How you handle yourself in the recruitment process could be an indication of how you would deal with clients/colleagues in your role and the impression that you leave a potential employer may be what sticks in their mind when they think of you (eg: isn't that the person who couldn't handle it when we told him he wasn't right for our role). The second option potentially produces a similar outcome. If you rush to get off the phone, the company may come back to you to talk about other roles when they come up, but you also leave open the potential thought with the company you were only interested in that role and therefore not interested in any other opportunities. Why take the risk of the company thinking that and potentially not getting back to you in the future. By picking the third option, you actively engage with the company and keep the dialogue open for future discussions. Ok, so you didn't get the role you interviewed for - you don't know who else the company may have been interviewing - maybe they found someone who was a better fit, or maybe there were too many boxes you didn't tick to step straight into that specific role. Take a deep breath and keep the company engaged. You are fresh in their mind - take advantage of that fact and let them know that while you respect their decision, that you are still interested in the company and would like to be kept in mind for future roles. Ask if it is ok to keep in touch and when they would like to keep in touch, as long as you are interested let them know you are still interested. You do need to balance that though if you come across as too keen or start stalking people - it could equally damage your brand. 
Companies normally have more than one open role. New roles are created all the time, circumstances change and hiring people is not a static business, it changes course from everyone's best laid initial plans. If you didn't get that initial role you wanted, keep the door open with that company so that when those new roles do come up or when circumstances do change you have already laid the ground to step into those new positions.

    Read the article

  • FIFA EM 2012: Prediction Game Application by Christian Rokitta

    - by carstenczarski
    You're not just an APEX developer, but also a football fan...? Then the EURO2012 prediction-game application by Christian Rokitta (built with APEX, of course) is exactly the right thing for you. But everyone else will also find here an APEX application that impressively demonstrates what is possible with an APEX application layout. There is considerably more in it than the supplied templates offer.

    Read the article

  • OWB 11gR2 – Degenerate Dimensions

    - by David Allan
    Ever wondered how to build degenerate dimensions in OWB and still get the benefits of slowly changing dimensions and cube loading? Now it's possible, through some changes in 11gR2 that make dimension and cube loading much more flexible. This lets you keep the benefits of OWB's surrogate key handling and slowly changing dimension references when loading a fact table that needs degenerate dimensions (see Ralph Kimball's degenerate dimensions design tip). Here we will see how to use the cube operator to load slowly changing, regular and degenerate dimensions. The cube and cube operator can now work with dimensions which have no surrogate key as well as dimensions with surrogates, so you can get the benefit of the cube loading and incorporate the degenerate dimension loading. What you need to do is create a dimension in OWB that is used purely as ETL metadata; the dimension itself is never loaded (its table is deployed, but holds no data), it has no surrogate keys, and it has a single level with a business attribute (the degenerate dimension data) and a dummy attribute, say description, just to pass the OWB validation. When this degenerate dimension is added into a cube, you will need to configure the fact table created and set the 'Deployable' flag to FALSE for the foreign key generated to the degenerate dimension table. The degenerate dimension reference will then be in the cube operator and used when matching. The steps are:
    Create the degenerate dimension using the regular wizard.
    Delete the Surrogate ID attribute; it is not needed.
    Define a level name for the dimension member (any name).
    After the wizard has completed, in the editor delete the hierarchy STANDARD that was automatically generated; there is only a single level, so no hierarchy is needed and it shouldn't really be created.
    Deploy the implementing table DD_ORDERNUMBER_TAB; this needs to be deployed but with no data (the mapping here will do a left outer join of the source data with the empty degenerate dimension table).
    Now go ahead and build your cube: use the regular TIMES dimension, for example, and your degenerate dimension DD_ORDERNUMBER; you can add in SCD dimensions etc. Configure the fact table created and set Deployable to false, so the foreign key does not get generated. You can now use the cube in a mapping and load data into the fact table via the cube operator; it will look after surrogate lookups and slowly changing dimension references. If you generate the SQL you will see that the ON clause for matching includes the columns representing the degenerate dimension. Here we have seen how this use case for loading fact tables using degenerate dimensions becomes a whole lot simpler using OWB 11gR2. I'm sure there are other use cases where this mix of dimensions with surrogate and regular identifiers is useful; fact tables partitioned by date columns is another classic example that this will greatly help and make the cube operator much more useful. Good to hear any comments.
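
    To make that last point concrete, here is a minimal sketch of the kind of statement the cube operator ends up generating: the degenerate dimension is matched directly on its business key in the ON clause, while the regular dimension goes through a surrogate key lookup. All table and column names below (sales_fact, sales_src, times_tab, order_number) are hypothetical and not taken from the post.
      -- Sketch only: the degenerate dimension (order_number) sits on the fact table as-is,
      -- while the TIMES reference is resolved to a surrogate key before matching.
      MERGE INTO sales_fact f
      USING (
        SELECT src.order_number,                 -- degenerate dimension: business key, no surrogate
               t.dimension_key AS times_key,     -- surrogate key from the regular TIMES dimension
               src.amount
        FROM   sales_src src
        LEFT OUTER JOIN times_tab t
               ON t.day_code = src.order_date
      ) s
      ON (    f.order_number = s.order_number    -- degenerate dimension column appears in the ON clause
          AND f.times_key    = s.times_key )
      WHEN MATCHED THEN
        UPDATE SET f.amount = s.amount
      WHEN NOT MATCHED THEN
        INSERT (order_number, times_key, amount)
        VALUES (s.order_number, s.times_key, s.amount);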

    Read the article

  • Observations in Migrating from JavaFX Script to JavaFX 2.0

    - by user12608080
    Observations in Migrating from JavaFX Script to JavaFX 2.0 Introduction Having been available for a few years now, there is a decent body of work written for JavaFX using the JavaFX Script language. With the general availability announcement of JavaFX 2.0 Beta, the natural question arises about converting the legacy code over to the new JavaFX 2.0 platform. This article reflects on some of the observations encountered while porting source code over from JavaFX Script to the new JavaFX API paradigm. The Application The program chosen for migration is an implementation of the Sudoku game and serves as a reference application for the book JavaFX – Developing Rich Internet Applications. The design of the program can be divided into two major components: (1) A user interface (ideally suited for JavaFX design) and (2) the puzzle generator. For the context of this article, our primary interest lies in the user interface. The puzzle generator code was lifted from a sourceforge.net project and is written entirely in Java. Regardless which version of the UI we choose (JavaFX Script vs. JavaFX 2.0), no code changes were required for the puzzle generator code. The original user interface for the JavaFX Sudoku application was written exclusively in JavaFX Script, and as such is a suitable candidate to convert over to the new JavaFX 2.0 model. However, a few notable points are worth mentioning about this program. First off, it was written in the JavaFX 1.1 timeframe, where certain capabilities of the JavaFX framework were as of yet unavailable. Citing two examples, this program creates many of its own UI controls from scratch because the built-in controls were yet to be introduced. In addition, layout of graphical nodes is done in a very manual manner, again because much of the automatic layout capabilities were in flux at the time. It is worth considering that this program was written at a time when most of us were just coming up to speed on this technology. One would think that having the opportunity to recreate this application anew, it would look a lot different from the current version. Comparing the Size of the Source Code An attempt was made to convert each of the original UI JavaFX Script source files (suffixed with .fx) over to a Java counterpart. Due to language feature differences, there are a small number of source files which only exist in one version or the other. The table below summarizes the size of each of the source files. JavaFX Script source file Number of Lines Number of Character JavaFX 2.0 Java source file Number of Lines Number of Characters ArrowKey.java 6 72 Board.fx 221 6831 Board.java 205 6508 BoardNode.fx 446 16054 BoardNode.java 723 29356 ChooseNumberNode.fx 168 5267 ChooseNumberNode.java 302 10235 CloseButtonNode.fx 115 3408 CloseButton.java 99 2883 ParentWithKeyTraversal.java 111 3276 FunctionPtr.java 6 80 Globals.java 20 554 Grouping.fx 8 140 HowToPlayNode.fx 121 3632 HowToPlayNode.java 136 4849 IconButtonNode.fx 196 5748 IconButtonNode.java 183 5865 Main.fx 98 3466 Main.java 64 2118 SliderNode.fx 288 10349 SliderNode.java 350 13048 Space.fx 78 1696 Space.java 106 2095 SpaceNode.fx 227 6703 SpaceNode.java 220 6861 TraversalHelper.fx 111 3095 Total 2,077 79,127 2531 87,800 A few notes about this table are in order: The number of lines in each file was determined by running the Unix ‘wc –l’ command over each file. The number of characters in each file was determined by running the Unix ‘ls –l’ command over each file. 
The examination of the code could certainly be much more rigorous. No standard formatting was performed on these files.  All comments however were deleted. There was a certain expectation that the new Java version would require more lines of code than the original JavaFX script version. As evidenced by a count of the total number of lines, the Java version has about 22% more lines than its FX Script counterpart. Furthermore, there was an additional expectation that the Java version would be more verbose in terms of the total number of characters.  In fact the preceding data shows that on average the Java source files contain fewer characters per line than the FX files.  But that's not the whole story.  Upon further examination, the FX Script source files had a disproportionate number of blank characters.  Why?  Because of the nature of how one develops JavaFX Script code.  The object literal dominates FX Script code.  Its not uncommon to see object literals indented halfway across the page, consuming lots of meaningless space characters. RAM consumption Not the most scientific analysis, memory usage for the application was examined on a Windows Vista system by running the Windows Task Manager and viewing how much memory was being consumed by the Sudoku version in question. Roughly speaking, the FX script version, after startup, had a RAM footprint of about 90MB and remained pretty much the same size. The Java version started out at about 55MB and maintained that size throughout its execution. What About Binding? Arguably, the most striking observation about the conversion from JavaFX Script to JavaFX 2.0 concerned the need for data synchronization, or lack thereof. In JavaFX Script, the primary means to synchronize data is via the bind expression (using the “bind” keyword), and perhaps to a lesser extent it’s “on replace” cousin. The bind keyword does not exist in Java, so for JavaFX 2.0 a Data Binding API has been introduced as a replacement. To give a feel for the difference between the two versions of the Sudoku program, the table that follows indicates how many binds were required for each source file. For JavaFX Script files, this was ascertained by simply counting the number of occurrences of the bind keyword. As can be seen, binding had been used frequently in the JavaFX Script version (and does not take into consideration an additional half dozen or so “on replace” triggers). The JavaFX 2.0 program achieves the same functionality as the original JavaFX Script version, yet the equivalent of binding was only needed twice throughout the Java version of the source code. JavaFX Script source file Number of Binds JavaFX Next Java source file Number of “Binds” ArrowKey.java 0 Board.fx 1 Board.java 0 BoardNode.fx 7 BoardNode.java 0 ChooseNumberNode.fx 11 ChooseNumberNode.java 0 CloseButtonNode.fx 6 CloseButton.java 0 CustomNodeWithKeyTraversal.java 0 FunctionPtr.java 0 Globals.java 0 Grouping.fx 0 HowToPlayNode.fx 7 HowToPlayNode.java 0 IconButtonNode.fx 9 IconButtonNode.java 0 Main.fx 1 Main.java 0 Main_Mobile.fx 1 SliderNode.fx 6 SliderNode.java 1 Space.fx 0 Space.java 0 SpaceNode.fx 9 SpaceNode.java 1 TraversalHelper.fx 0 Total 58 2 Conclusions As the JavaFX 2.0 technology is so new, and experience with the platform is the same, it is possible and indeed probable that some of the observations noted in the preceding article may not apply across other attempts at migrating applications. 
That being said, this first experience indicates that the migrated Java code will likely be larger, though not extensively so, than the original Java FX Script source. Furthermore, although very important, it appears that the requirements for data synchronization via binding, may be significantly less with the new platform.

    Read the article

  • E 2.0 Value Metaphors

    - by Tom Tonkin
    I guess I have been doing this too long. I can easily see the value of Enterprise 2.0 technology for an organization, but find it a challenge at times to convey that same value to others. I also know that I'm not the only one that has that issue. Others who have that same passion also suffer from being, perhaps, too close to the market. I was having this same discussion with a few colleagues when one of them suggested that metaphors might be a good vehicle to communicate the value to those that are not as familiar. One such metaphor was discussed. Apparently, back in the early '50s, there was a great Air Force aviator and military strategist by the name of John Boyd. Without going into a ton of detail (you can search him on the internet), what made Colonel Boyd great was that he never lost a dogfight. As a matter of fact, they called him 'Forty-Second Boyd' since he claimed to be able to beat anyone in any type of aircraft in less than forty seconds, even if his aircraft was inferior to his opponent's. His approach was unique. He observed over time that there was a pattern in how aviators engaged in a dogfight. He called this method OODA. It describes how a person or, in our case, an organization, reacts to an event. OODA is an acronym for Observation, Orientation, Decision and Action. Again, there is a lot more on the internet about this. A pilot would go through this loop several times during a dogfight, and Boyd would try to predict the loop and interrupt it by changing the landscape of the actual dogfight. This gave Boyd an advantage: he could anticipate what his opponent would do and then counterattack. Boyd went on to say that many companies have a similar reaction loop and that by understanding that loop, organizations would be able to adjust better to market conditions, predict what the competition is doing and reposition themselves to gain competitive advantages. So, our metaphor would be that Enterprise 2.0 provides companies greater visibility into their business by connecting to employees, customers and partners in a collaborative fashion. This, in turn, helps them navigate through the tough times and provides lines of sight to more innovative ideas. Innovation is the last tool for companies to achieve competitive advantage (maybe a discussion for another post). Perhaps this is wordier than some other metaphors, but it does allow an interesting dialogue to start and maybe even a framework to fulfill the promise of E 2.0. So, I'm sure there are many more metaphors for the value that E 2.0 brings to organizations. Do you have one to share? Please comment below, and thanks for stopping by.

    Read the article

  • The ugly evolution of running a background operation in the context of an ASP.NET app

    - by Jeff
    If you’re one of the two people who has followed my blog for many years, you know that I’ve been going at POP Forums now for over almost 15 years. Publishing it as an open source app has been a big help because it helps me understand how people want to use it, and having it translated to six languages is pretty sweet. Despite this warm and fuzzy group hug, there has been an ugly hack hiding in there for years. One of the things we find ourselves wanting to do is hide some kind of regular process inside of an ASP.NET application that runs periodically. The motivation for this has always been that a lot of people simply don’t have a choice, because they’re running the app on shared hosting, or don’t otherwise have access to a box that can run some kind of regular background service. In POP Forums, I “solved” this problem years ago by hiding some static timers in an HttpModule. Truthfully, this works well as long as you don’t run multiple instances of the app, which in the cloud world, is always a possibility. With the arrival of WebJobs in Azure, I’m going to solve this problem. This post isn’t about that. The other little hacky problem that I “solved” was spawning a background thread to queue emails to subscribed users of the forum. This evolved quite a bit over the years, starting with a long running page to mail users in real-time, when I had only a few hundred. By the time it got into the thousands, or tens of thousands, I needed a better way. What I did is launched a new thread that read all of the user data in, then wrote a queued email to the database (as in, the entire body of the email, every time), with the properly formatted opt-out link. It was super inefficient, but it worked. Then I moved my biggest site using it, CoasterBuzz, to an Azure Website, and it stopped working. So let’s start with the first stupid thing I was doing. The new thread was simply created with delegate code inline. As best I can tell, Azure Websites are more aggressive about garbage collection, because that thread didn’t queue even one message. When the calling server response went out of scope, so went the magic background thread. Duh, all I had to do was move the thread to a private static variable in the class. That’s the way I was able to keep stuff running from the HttpModule. (And yes, I know this is still prone to failure, particularly if the app recycles. For as infrequently as it’s used, I have not, however, experienced this.) It was still failing, but this time I wasn’t sure why. It would queue a few dozen messages, then die. Running in Azure, I had to turn on the application logging and FTP in to see what was going on. That led me to a helper method I was using as delegate to build the unsubscribe links. The idea here is that I didn’t want yet another config entry to describe the base URL, appended with the right path that would match the routing table. No, I wanted the app to figure it out for you, so I came up with this little thing: public static string FullUrlHelper(this Controller controller, string actionName, string controllerName, object routeValues = null) { var helper = new UrlHelper(controller.Request.RequestContext); var requestUrl = controller.Request.Url; if (requestUrl == null) return String.Empty; var url = requestUrl.Scheme + "://"; url += requestUrl.Host; url += (requestUrl.Port != 80 ? ":" + requestUrl.Port : ""); url += helper.Action(actionName, controllerName, routeValues); return url; } And yes, that should have been done with a string builder. 
This is useful for sending out the email verification messages, too. As clever as I thought I was with this, I was using a delegate in the admin controller to format these unsubscribe links for tens of thousands of users. I passed that delegate into a service class that did the email work: Func<User, string> unsubscribeLinkGenerator = user => this.FullUrlHelper("Unsubscribe", AccountController.Name, new { id = user.UserID, key = _profileService.GetUnsubscribeHash(user) }); _mailingListService.MailUsers(subject, body, htmlBody, unsubscribeLinkGenerator); Cool, right? Actually, not so much. If you look back at the helper, this delegate then will depend on the controller context to learn the routing and format for the URL. As you might have guessed, those things were turning null after a few dozen formatted links, when the original request to the admin controller went away. That this wasn’t already happening on my dedicated server is surprising, but again, I understand why the Azure environment might be eager to reclaim a thread after servicing the request. It’s already inefficient that I’m building the entire email for every user, but going back to check the routing table for the right link every time isn’t a win either. I put together a little hack to look up one generic URL, and use that as the basis for a string format. If you’re wondering why I didn’t just use the curly braces up front, it’s because they get URL formatted: var baseString = this.FullUrlHelper("Unsubscribe", AccountController.Name, new { id = "--id--", key = "--key--" }); baseString = baseString.Replace("--id--", "{0}").Replace("--key--", "{1}"); Func unsubscribeLinkGenerator = user => String.Format(baseString, user.UserID, _profileService.GetUnsubscribeHash(user)); _mailingListService.MailUsers(subject, body, htmlBody, unsubscribeLinkGenerator); And wouldn’t you know it, the new solution works just fine. It’s still kind of hacky and inefficient, but it will work until this somehow breaks too.

    Read the article

  • Excel Template Teaser

    - by Tim Dexter
    In lieu of some official documentation, I'm in the process of putting together some posts on the new 10.1.3.4.1 Excel templates. No more HTML masquerading as Excel; far more flexibility than Excel Analyzer, and no need to write complex XSL templates to create the same output. Multi-sheet outputs with macros and embeddable XSL commands are here. Their capabilities are pretty extensive, and I have not worked on them for a few years since I helped put them together for EBS FSG users, so I'm back on the learning curve. Let me say up front: there is no template builder, it's a completely manual process to build them, but the results can be fantastic and provide yet another 'superstar' opportunity for you. The templates can take hierarchical XML data and walk the structure much like an RTF template. They use named cells/ranges and a hidden sheet to provide the rendering engine the hooks to drop the data in. As a taster, here's the data and output I worked with on my first effort:
      <EMPLOYEES>
       <LIST_G_DEPT>
        <G_DEPT>
         <DEPARTMENT_ID>10</DEPARTMENT_ID>
         <DEPARTMENT_NAME>Administration</DEPARTMENT_NAME>
         <LIST_G_EMP>
          <G_EMP>
           <EMPLOYEE_ID>200</EMPLOYEE_ID>
           <EMP_NAME>Jennifer Whalen</EMP_NAME>
           <EMAIL>JWHALEN</EMAIL>
           <PHONE_NUMBER>515.123.4444</PHONE_NUMBER>
           <HIRE_DATE>1987-09-17T00:00:00.000-06:00</HIRE_DATE>
           <SALARY>4400</SALARY>
          </G_EMP>
         </LIST_G_EMP>
         <TOTAL_EMPS>1</TOTAL_EMPS>
         <TOTAL_SALARY>4400</TOTAL_SALARY>
         <AVG_SALARY>4400</AVG_SALARY>
         <MAX_SALARY>4400</MAX_SALARY>
         <MIN_SALARY>4400</MIN_SALARY>
        </G_DEPT>
        ...
       </LIST_G_DEPT>
      </EMPLOYEES>
    Structured XML coming from a data template - check out the data template progression post. I can then generate the following binary XLS file. There are a few cool things to notice in this output:
    DEPARTMENT-EMPLOYEE master-detail output. Not easy to do in the Excel Analyzer.
    Date formatting - this uses an Excel function. Remember, BIP generates XML dates in the canonical format. I have formatted the other data in the template using native Excel functionality.
    Salary total - although it is in the data, I have calculated this in the template.
    Conditional formatting - this is handled by Excel based on the incoming data.
    Bursting department data across sheets and using the department name for the sheet name. This alone is worth the wait!
    There's more, but this is surely enough to whet your appetite. These new templates are already tucked away in EBS R12 under controlled release by the GL team and have now come to the BIEE and standalone releases in the 10.1.3.4.1+ rollup patch. For the rest of you, it's going to be a bit of a waiting game for the relevant teams to uptake the latest BIP release. Look out for more soon with some explanation of how they work and how to put them together!
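
    For context, the aggregate elements in the sample data (TOTAL_EMPS, TOTAL_SALARY and friends) are the kind of values a data template typically derives from its query. As a rough, hypothetical sketch against the standard HR sample schema (not the actual data template behind this post), the underlying query could look something like this:
      -- Sketch of a department/employee query producing per-department aggregates
      SELECT d.department_id, d.department_name,
             e.employee_id,
             e.first_name || ' ' || e.last_name AS emp_name,
             e.email, e.phone_number, e.hire_date, e.salary,
             COUNT(*)      OVER (PARTITION BY d.department_id) AS total_emps,
             SUM(e.salary) OVER (PARTITION BY d.department_id) AS total_salary,
             AVG(e.salary) OVER (PARTITION BY d.department_id) AS avg_salary,
             MAX(e.salary) OVER (PARTITION BY d.department_id) AS max_salary,
             MIN(e.salary) OVER (PARTITION BY d.department_id) AS min_salary
      FROM   departments d
      JOIN   employees   e ON e.department_id = d.department_id
      ORDER  BY d.department_id, e.employee_id;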

    Read the article

  • You Can't Win on Price

    - by David Dorf
    This year I did the majority of my Christmas shopping from the comfort of my home office. There aren't many things in stores you can't find online these days. I find it easier to search, research, and compare products online rather than walking the mall anyway. But there's a segment of the population that likes to be in the store, touching the products. For those people, smartphones avail them some of the e-commerce features I mentioned right there in the aisles. First it was RedLaser, then TheFind, ShopSavvy and many others. But the one that should be scaring retailers is Amazon's PriceCheck application. It lets you scan the product barcode, take a picture of the product, or speak the product's name. Once the product is identified, it shows the online prices, with Amazon at the top of the list. Within 10 seconds you can order the item and Amazon Prime members get free 2-day shipping too. I don't think fashion and grocery retailers need to worry much, but I have to believe smartphones are helping Amazon win a little more of the brand-name hardgoods market. So what's a retailer to do? Best Buy has begun to put QR Codes on their shelf labels that are easily scanned by smartphones and take the consumer to a Best Buy Web page where they can get extended information about the product. The consumer is getting the additional information they want, and Best Buy avoids the price comparisons. Of course if a consumer chooses to use the Amazon PriceCheck app, then all bets are off. That's when Best Buy has to hope the in-store experience and customer service will save the sale. My point is that the internet makes information available to everyone, and smartphones make it available anywhere. Unless you want your store to be Amazon's local showroom, you need to be price-competitive but differentiate on other aspects of the shopping experience. With the cost of running a physical store, you can't win on price.

    Read the article

  • iPhone Peripherals for Retailers

    - by David Dorf
    I saw RedLaser on the latest "Shopper" iPhone commercial on TV. Works great for consumers, but retailers will be more interested in a true barcode reader from someone like Infinite Peripherals, which also comes with a magstripe reader I previously mentioned the offerings from Square Verifone, and Mophie that allow swiping credit cards with an iPhone as well. So what's next? There's a decent list at WireLust that includes an IR dongle that turns your iPhone into a TV remote, armband monitors for use when exercising, and most recently a NFC/RFID reader. iCarte from Canadian firm Wireless Dynamics looks interesting. This device can be used for NFC payments and for reading RFID tags. The Canon printer I just bought for home has an iPhone app that lets me send iPhone pictures directly to the printer for printing. In that same vein, Seems like retailers could use bluetooth to print receipts on strategically place printers on the floor. I can't wait to see what they come up with for the iPad.

    Read the article

  • Hello World Pagelet

    - by astemkov
    Introduction The goal of this exercise is to give you a basic feel of how you can use Pagelet Producer to proxy a web page We will proxy a simple static Hello World web page, cut one section out of that page and present it as a pagelet that you can later insert on your own application page or to your portal page such as WebCenter Portal space or WebCenter Interaction community page. Hello World sample app This is the static web page we will work with: Let's assume the following: The Hello World web page is running on server http://appserver.company.com:1234/ The Hello World web page path is: http://appserver.company.com:1234/helloworld/ Initial Pagelet Producer setup Let's assume that the Pagelet Producer server is running on http://pageletserver.company.com:8889/pagelets/ First let's check that Pagelet Producer is up and running. In order to do that we just need to access the following URL: http://pageletserver.company.com:8889/pagelets/ And this is what should be returned: Now you can access Pagelet Producer administration screens using this URL: http://pageletserver.company.com:8889/pagelets/admin This is how the UI looks: Now if you connect to the internet via proxy server, you need to configure proxy in Pagelet Producer settings. In the Navigator pane: Jump To - Settings Click on "Proxy" Enter your proxy server configuration: Creating a resource First thing that you need to do is to create a resource for your web page. This will tell Pagelet Producer that all sub-paths of the web page should be proxied. It also will allow you to setup common rules of how your web page should be proxied and will serve as a container for your pagelets. In the Navigator pane: Jump To - Resources Click on any existing resource (ex. welcome_resource) Click on "Create selected type" toolbar button at the top of the Navigator pane Select "Web" in the "Select Producer Type" dialog box and click "OK" Now after the resource is created let's click on "General" sub-item a specify the following values Name = AppServer Source URL = http://appserver.company.com:1234/ Destination URL = /appserver/ Click on "Save" toolbar button at the top of the Navigator pane After the resource is created our web page becomes accessible by the URL: http://pageletserver.company.com:8889/pagelets/appserver/helloworld/ So in original web page address Source URL is replaced with Pagelet Producer URL (http://pageletserver.company.com:8889/pagelets) + Destination URL Creating a pagelet Now let's create "Hello World" pagelet. Under the resource node activate Pagelets subnode Click on "Create selected type" toolbar button at the top of the Navigator pane Click on "General" sub-node of newly created pagelet and specify the following values Name = Hello_World Library = MyLib Library is used for logical grouping. The portals use the "Library" value to group pagelets in their respective UI's. For example, when adding pagelets to a WebCenter Portal space you would see the individual pagelets listed under the "Library" name. URL Suffix = helloworld/index.html this is where the Hello World page html is served from Click on "Save" toolbar button at the top of the Navigator pane The Library name can be anything you want, it doesn't have to match the resource name at all. It is used as a logical grouping of pagelets, and you can include pagelets from multiple resources into the same library or create a new library for each pagelet. 
After you save the pagelet you can access it here: http://pageletserver.company.com:8889/pagelets/inject/v2/pagelet/MyLib/Hello_World which is : http://pageletserver.company.com:8889/pagelets/inject/v2/pagelet/ + [Library] + [Name] Or to test the injection of a pagelet into iframe you can click on the pagelets "Documentation" sub-node and use "Access Pagelet using REST" URL: This is what we will see: Clipping The pagelet that we just created covers the whole web page, but we want just the "Hello World" segment of it. So let's clip it. Under the Hello_World pagelet node activate Clipper sub-node Click on "Create selected type" toolbar button at the top of the Navigator pane Specify a Name for newly created clipper. For example: "c1" Click on "Content" sub-node of the clipper Click on "Launch Clipper" button New browser window will open By moving a mouse pointer over the web page select the area you want to clip: Click left mouse button - the browser window will disappear and you will see that Clipping Path was automatically generated Now let's save and access the link from the "Documentation" page again Here's our pagelet nicely clipped and ready for being used on your Web Center Space

    Read the article

  • Faster Applications Even Without Changing the Query - Part 2 - The Effect of Bind Variables

    - by david.krch
    In the previous installment we used the example of inserting 100,000 records to show how much load needlessly frequent committing can put on a database server. We reduced the processing time of that operation from 167 to 105 seconds, i.e. by a third. Two more steps remain before we reach the promised eighty-fold speedup. At the end of the previous part we found that parsing (analysis and optimization) of the statements took the server a full 74 of those 105 seconds. Today we will focus precisely on minimizing that parse time.
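
    To illustrate the point of the series (the full post continues elsewhere), here is a minimal sketch of the difference bind variables make for this kind of bulk insert. The table and columns are made up for the example: with literals every statement has a different text and must be hard parsed, while with bind variables the statement is hard parsed once and then reused.
      -- Hypothetical table for the illustration
      -- CREATE TABLE t (id NUMBER, val VARCHAR2(30));

      -- With literals: 100,000 distinct statement texts, each hard parsed separately
      --   INSERT INTO t (id, val) VALUES (1, 'row 1');
      --   INSERT INTO t (id, val) VALUES (2, 'row 2');
      --   ...

      -- With bind variables: PL/SQL passes i and the string as binds automatically,
      -- so the SQL text never changes and is hard parsed only once
      BEGIN
        FOR i IN 1 .. 100000 LOOP
          INSERT INTO t (id, val) VALUES (i, 'row ' || TO_CHAR(i));
        END LOOP;
        COMMIT;
      END;
      /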

    Read the article

  • Delegation of Solaris Zone Administration

    - by darrenm
    In Solaris 11 'Zone Delegation' is a built-in feature. The Zones system now uses fine-grained RBAC authorisations to allow delegation of management of distinct zones, rather than all zones, which is what the 'Zone Management' RBAC profile did in Solaris 10. The data for this can be stored with the Zone, or you could also create RBAC profiles (that can even be stored in NIS or LDAP) for granting access to specific lists of Zones to administrators. For example, let's say we have zones named zoneA through zoneF and we have three admins: alice, bob, carl. We want to grant a subset of the zone management to each of them. We could do that either by adding the admin resource to the appropriate zones via zonecfg(1M), or we could do something like this with RBAC data directly. First let's look at an example of storing the data with the zone.
      # zonecfg -z zoneA
      zonecfg:zoneA> add admin
      zonecfg:zoneA> set user=alice
      zonecfg:zoneA> set auths=manage
      zonecfg:zoneA> end
      zonecfg:zoneA> commit
      zonecfg:zoneA> exit
    Now let's look at the alternate method of storing this directly in the RBAC database, but we will show all our admins and zones for this example:
      # usermod -P +'Zone Management' -A +solaris.zone.manage/zoneA alice
      # usermod -A +solaris.zone.login/zoneB alice
      # usermod -P +'Zone Management' -A +solaris.zone.manage/zoneB bob
      # usermod -A +solaris.zone.manage/zoneC bob
      # usermod -P +'Zone Management' -A +solaris.zone.manage/zoneC carl
      # usermod -A +solaris.zone.manage/zoneD carl
      # usermod -A +solaris.zone.manage/zoneE carl
      # usermod -A +solaris.zone.manage/zoneF carl
    In the above, alice can only manage zoneA, bob can manage zoneB and zoneC, and carl can manage zoneC through zoneF. The user alice can also log in on the console to zoneB, but she can't do the operations that require the solaris.zone.manage authorisation on it. Or, if you have a large number of zones and/or admins, or you just want to provide a layer of abstraction, you can collect the authorisation lists into an RBAC profile and grant that to the admins. For example, let's create RBAC profiles for the things that alice and carl can do.
      # profiles -p 'Zone Group 1'
      profiles:Zone Group 1> set desc="Zone Group 1"
      profiles:Zone Group 1> add profile="Zone Management"
      profiles:Zone Group 1> add auths=solaris.zone.manage/zoneA
      profiles:Zone Group 1> add auths=solaris.zone.login/zoneB
      profiles:Zone Group 1> commit
      profiles:Zone Group 1> exit
      # profiles -p 'Zone Group 3'
      profiles:Zone Group 3> set desc="Zone Group 3"
      profiles:Zone Group 3> add profile="Zone Management"
      profiles:Zone Group 3> add auths=solaris.zone.manage/zoneD
      profiles:Zone Group 3> add auths=solaris.zone.manage/zoneE
      profiles:Zone Group 3> add auths=solaris.zone.manage/zoneF
      profiles:Zone Group 3> commit
      profiles:Zone Group 3> exit
    Now, instead of granting carl and alice the 'Zone Management' profile and the authorisations directly, we can just give them the appropriate profile.
      # usermod -P +'Zone Group 3' carl
      # usermod -P +'Zone Group 1' alice
    If we wanted to store the profile data and the profiles granted to the users in LDAP, just add '-S ldap' to the profiles and usermod commands. For a documentation overview see the description of the "admin" resource in zonecfg(1M), profiles(1) and usermod(1M).

    Read the article

  • Formating Columns in Excel created by af:exportCollectionActionListener

    - by Duncan Mills
    The af:exportCollectionActionListener behavior in ADF Faces Rich client provides a very simple way of quickly dumping out the contents or selected rows in a table or treeTable to Excel. However, that simplicity comes at a price as it pretty much left up to Excel how to format the data. A common use case where you have a problem is that of ID columns which are often long numerics. You probably want to represent this data as a string, Excel however will probably have other ideas and render it as an exponent  - not what you intended. In earlier releases of the framework you could sort of work around this by taking advantage of a bug which would allow you to surround the outputText in question with invisible outputText components which provided formatting hints to Excel. Something like this: <af:column headertext="Some wide label">  <af:panelgrouplayout layout="horizontal">     <af:outputtext value="=TEXT(" visible="false">     <af:outputtext value="#{row.bigNumberValue}" rendered="true"/>    <af:outputtext value=",0)" visible="false">   </af:panelgrouplayout> </af:column> However, this bug was fixed and so it can no longer be used as a trick, the export now ignores invisible columns. So, if you really need control over the formatting there are several alternatives: First the more powerful ADF Desktop Integration (ADFdi) package which allows you to build fully transactional spreadsheets that "pull" the data and can update it. This gives you all the control that might need on formatting but it does need specific Excel Add-ins on the client to work. For more information about ADFdi have a look at this tutorial on OTN. Or you can of course look at BI Publisher or Apache POI if you're happy with output only spreadsheets

    Read the article

  • T-4 Templates for ASP.NET Web Form Databound Control Friendly Logical Layers

    - by joycsharp
    I just released an open source project at CodePlex, which includes a set of T-4 templates that let you build logical layers (i.e. DAL/BLL) with just a few clicks! The logical layers implemented here are based on Entity Framework 4.0, are ASP.NET Web Form data bound control friendly, and are fully unit testable. In this open source project you get Entity Framework 4.0 based T-4 templates for the following types of logical layers:
    Data Access Layer: Entity Framework 4.0 provides an excellent ORM data access layer. It also includes support for T-4 templates as a built-in code generation strategy in Visual Studio 2010, so we can customize the default structure of the Entity Framework based data access layer. That default structure has been enhanced here to support mock testing of the Entity Framework 4.0 object model.
    Business Logic Layer: an ASP.NET Web Form data bound control friendly business logic layer, which lets you build data bound web applications on top of ASP.NET Web Forms and Entity Framework 4.0 quickly, in a few clicks, with good support for mock testing.
    Download it to make your web development more productive. Enjoy!

    Read the article

  • SQL SERVER – Identifying guest User using Policy Based Management

    - by pinaldave
    If you are following my recent blog posts, you may have noticed that I've been writing a lot about the guest user in SQL Server. Here are all the blog posts which I have written on this subject: SQL SERVER – Disable Guest Account – Serious Security Issue; SQL SERVER – Force Removing User from Database – Fix: Error: Could not drop login 'test' as the user is currently logged in; SQL SERVER – Detecting guest User Permissions – guest User Access Status; SQL SERVER – guest User and MSDB Database – Enable guest User on MSDB Database. One of the requests I received was whether we could create a policy that would verify that the guest user is not enabled in user databases. Well, here is a quick tutorial to answer this. Let us see how quickly we can do it.
    Requirements: Check if the guest user is disabled in all the user-created databases. Exclude the master, tempdb and msdb databases from the guest user validation.
    We will create the following conditions based on the above two requirements: If the name of the user is 'guest'; If the user has connect (@hasDBAccess) permission in the database; Check in all user databases, except master, tempdb and msdb. Once we create these conditions, we will create a policy which will validate them.
    Condition 1: Is the User Guest? Expand the Database >> Management >> Policy Management >> Conditions. Right click on Conditions, and click on "New Condition…". First we will create a condition where we validate whether the user name is 'guest', and if so, we will further validate whether it has DB access. Check the image for the necessary configuration for the condition: Facet: User; Expression: @Name = 'guest'
    Condition 2: Does the User have DBAccess? Expand the Database >> Management >> Policy Management >> Conditions. Right click on Conditions and click on "New Condition…". Now we will validate whether the user has DB access. Check the image for the necessary configuration for the condition: Facet: User; Expression: @hasDBAccess = False
    Condition 3: Exclude Databases. Expand the Database >> Management >> Policy Management >> Conditions. Right click on Conditions and click on "New Condition…". Now we will create a condition where we check whether the database name is master, tempdb or msdb; if it is any of them, we will not validate our first condition against it. Check the image for the necessary configuration for the condition: Facet: Database; Expression: @Name != 'msdb' AND @Name != 'tempdb' AND @Name != 'master'
    The next step is creating a policy which will enforce these conditions.
    Creating a Policy: Right click on Policies and click "New Policy…". Here we specify which condition we want to validate and what the target is. Condition: Has User DBAccess; Target Database: Every Database except (master, tempdb and msdb); Target User: Every User in Target Database with name 'guest'. Now we have two evaluation modes to choose from: 1) On Demand and 2) On Schedule. We will select On Demand in this example; however, you can change the mode to On Schedule through the drop down menu and select the interval for evaluating the policy.
    Evaluate the Policies: We have selected On Demand as our policy evaluation mode, so we evaluate by executing the policy manually. Click on Evaluate and it will give the following result: one of the databases has a policy violation - the guest user is enabled in the AdventureWorks database. You can disable the guest user by running the following code in the AdventureWorks database.
      USE AdventureWorks;
      REVOKE CONNECT FROM guest;
    Once you run the above query, you can evaluate the policy again. Notice that the policy violation is now fixed. You can change the policy evaluation method to On Schedule and validate the policy at an interval. You can check the history of the policy and detect violations.
    Quiz: I have created three conditions to check whether the guest user has database access or not. Now I want to ask you: Is it possible to do the same with 2 conditions? If yes, HOW? If no, WHY NOT?
    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Best Practices, CodeProject, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology Tagged: Policy Management
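
    As a quick ad-hoc sanity check alongside the policy, here is a small T-SQL sketch (my own, not from the article) that lists user databases where guest currently holds CONNECT permission; the undocumented sp_MSforeachdb procedure is used only for brevity:
      -- Sketch: list user databases where the guest user has been granted CONNECT
      EXEC sp_MSforeachdb N'
      USE [?];
      IF DB_NAME() NOT IN (''master'', ''tempdb'', ''msdb'')
          SELECT DB_NAME() AS database_name
          FROM   sys.database_permissions dp
          JOIN   sys.database_principals  pr
                 ON dp.grantee_principal_id = pr.principal_id
          WHERE  pr.name = ''guest''
          AND    dp.permission_name = ''CONNECT''
          AND    dp.state = ''G'';';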

    Read the article

  • Siebel Troubleshooting : An ODBC error occurred; SBL-GEN-03006: Error calling function: DICFindTable m_pReqTbl

    - by Giri Mandalika
    Symptom: A newly installed Siebel application server fails to start despite successful ODBC connectivity to the database. SRProc process logs ODBC error messages similar to the following: Message: GEN-13, Additional Message: dict-ERR-1109: Unable to read value from export file (Data length (32) Column definition (3)). Message: GEN-13, Additional Message: dict-ERR-1107: Unable to read row 0 from export file (UTLDataValRead pBuf, col 4 ). GenericLog GenericError 1 0002157.. 11-11-18 13:28 Message: Generated SQL statement:, Additional Message: SQLFetch: SELECT RDOBJ.DOCK_ID, RDOBJ.RELATED_DOCK_ID, RDOBJ.SQL_STATEMENT, RDOBJ.CHECK_VISIBILITY, 'N', RDOBJ.COMMENTS, RDOBJ.ACTIVE, RDOBJ.SEQUENCE, RDOBJ.VIS_STRENGTH, RDOBJ.REL_VIS_STRENGTH, RDOBJ.VIS_EVT_COLS FROM ORAPERF.S_DOCK_REL_DOBJ RDOBJ, ORAPERF.S_DOCK_OBJECT DOBJ WHERE RDOBJ.REPOSITORY_ID = (SELECT ROW_ID FROM ORAPERF.S_REPOSITORY WHERE NAME = ?) AND DOBJ.ROW_ID = RDOBJ.DOCK_ID AND (DOBJ.INACTIVE_FLG = 'N' OR DOBJ.INACTIVE_FLG IS NULL) AND (RDOBJ.INACTIVE_FLG = 'N' OR RDOBJ.INACTIVE_FLG IS NULL) Message: Error: An ODBC error occurred, Additional Message: Function: DICGetRDObjects; ODBC operation: SQLFetch Message: GEN-13, Additional Message: dict-ERR-1109: Unable to read value from export file (UTLCompressFRead (fseek)). Message: GEN-13, Additional Message: dict-ERR-1107: Unable to read row 0 from export file (UTLDataValRead pBuf, col 0 ). Message: GEN-10, Additional Message: Calling Function: DICLoadDObjectInfo; Called Function: Calling DICGetRDObjects Message: GEN-10, Additional Message: Calling Function: DICLoadDict; Called Function: DICLoadDObjectInfo GenericError (srpdb.cpp (860) err=3006 sys=2) SBL-GEN-03006: Error calling function: DICFindTable m_pReqTbl (srpsmech.cpp (74) err=3006 sys=0) SBL-GEN-03006: Error calling function: DICFindTable m_pReqTbl (srpmtsrv.cpp (107) err=3006 sys=0) SBL-GEN-03006: Error calling function: DICFindTable m_pReqTbl (smimtsrv.cpp (1203) err=3006 sys=0) SBL-GEN-03006: Error calling function: DICFindTable m_pReqTbl SmiLayerLog Error Terminate process due to unrecoverable error: 3006. (Main Thread) An inconsistent or corrupted dictionary file "diccache.dat" is likely the cause. Solution: Stop the application server and manually kill the remaining Siebel application specific processes eg., stop_server all pkill siebmtsh pkill siebproc .. Remove $SIEBEL_HOME/bin/diccache.dat file. It will be re-generated during the application server startup Start the application server start_server all

    Read the article

  • CodePlex Daily Summary for Friday, June 29, 2012

    CodePlex Daily Summary for Friday, June 29, 2012Popular ReleasesSupporting Guidance and Whitepapers: v1 - Supporting Media: Welcome to the Release Candidate (RC) release of the ALM Rangers Readiness supporting edia As this is a RC release and the quality bar for the final Release has not been achieved, we value your candid feedback and recommend that you do not use or deploy these RC artifacts in a production environment. Quality-Bar Details All critical bugs have been resolved Known Issues / Bugs Practical Ruck training workshop not yet includedDesigning Windows 8 Applications with C# and XAML: Chapters 1 - 7 Release Preview: Source code for all examples from Chapters 1 - 7 for the Release PreviewDataBooster - Extension to ADO.NET Data Provider: DataBooster Library for Oracle + SQL Server Beta2: This is a derivative library of dbParallel project http://dbparallel.codeplex.com. All above binaries releases require .NET Framework 4.0 or later. SQL Server support is always build-in (can't be unplugged). The first download (DLL) also requires ODP.NET to connect Oracle; The second download (DLL) also requires DataDirect(3.5) to connect Oracle; The third download (DLL) doesn't support Oracle. Please download the source code if the provider need to be replaced by others. For example ODP.NE...Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.57: Fix for issue #18284: evaluating literal expressions in the pattern c1 * (x / c2) where c1/c2 is an integer value (as opposed to c2/c1 being the integer) caused the expression to be destroyed.Visual Studio ALM Quick Reference Guidance: v2 - Visual Studio 2010 (Japanese): Rex Tang (?? ??) http://blogs.msdn.com/b/willy-peter_schaub/archive/2011/12/08/introducing-the-visual-studio-alm-rangers-rex-tang.aspx, Takaho Yamaguchi (?? ??), Masashi Fujiwara (?? ??), localized and reviewed the Quick Reference Guidance for the Japanese communities, based on http://vsarquickguide.codeplex.com/releases/view/52402. The Japanese guidance is available in AllGuides and Everything packages. The AllGuides package contains guidances in PDF file format, while the Everything packag...Visual Studio Team Foundation Server Branching and Merging Guide: v1 - Visual Studio 2010 (Japanese): Rex Tang (?? ??) http://blogs.msdn.com/b/willy-peter_schaub/archive/2011/12/08/introducing-the-visual-studio-alm-rangers-rex-tang.aspx, Takaho Yamaguchi (?? ??), Hirokazu Higashino (?? ??), localized and reviewed the Branching Guidance for the Japanese communities, based on http://vsarbranchingguide.codeplex.com/releases/view/38849. The Japanese guidance is available in AllGuides and Everything packages. The AllGuides package contains guidances in PDF file format, while the Everything packag...SQL Server FineBuild: Version 3.1.0: Top SQL Server FineBuild Version 3.1.0This is the stable version of FineBuild for SQL Server 2012, 2008 R2, 2008 and 2005 Documentation FineBuild Wiki containing details of the FineBuild process Known Issues Limitations with this release FineBuild V3.1.0 Release Contents List of changes included in this release Please DonateFineBuild is free, but please donate what you think FineBuild is worth as everything goes to charity. 
Tearfund is one of the UK's leading relief and de...EasySL: RapidSL V2: Rewrite RapidSL UI Framework, Using Silverlight 5.0 EF4.1 Code First Ria Service SP2 + Lastest Silverlight Toolkit.NETDeob0: NETDeob 0.1.1: http://i.imgur.com/54C78.pngSOLID by example: All examples: All solid examplesSiteMap Editor for Microsoft Dynamics CRM 2011: SiteMap Editor (1.1.1726.406): Use of new version of connection controls for a full support of OSDP authentication mechanism for CRM Online.Umbraco CMS: Umbraco CMS 5.2: Development on Umbraco v5 discontinued After much discussion and consultation with leaders from the Umbraco community it was decided that work on the v5 branch would be discontinued with efforts being refocused on the stable and feature rich v4 branch. For full details as to why this decision was made please watch the CodeGarden 12 Keynote. What about all that hard work?!?? We are not binning everything and it does not mean that all work done on 5 is lost! we are taking all of the best and m...CodeGenerate: CodeGenerate Alpha: The Project can auto generate C# code. Include BLL Layer、Domain Layer、IDAL Layer、DAL Layer. Support SqlServer And Oracle This is a alpha program,but which can run and generate code. Generate database table info into MS WordXDA ROM HUB: XDA ROM HUB v0.9: Kernel listing added -- Thanks to iONEx Added scripts installer button. Added "Nandroid On The Go" -- Perform a Nandroid backup without a PC! Added official Android app!ExtAspNet: ExtAspNet v3.1.8.2: +2012-06-24 v3.1.8 +????Grid???????(???????ExpandUnusedSpace????????)(??)。 -????MinColumnWidth(??????)。 -????AutoExpandColumn,???????????????(ColumnID)(?????ForceFitFirstTime??ForceFitAllTime,??????)。 -????AutoExpandColumnMax?AutoExpandColumnMin。 -????ForceFitFirstTime,????????????,??????????(????????????)。 -????ForceFitAllTime,????????????,??????????(??????????????????)。 -????VerticalScrollWidth,????????(??????????,0?????????????)。 -????grid/grid_forcefit.aspx。 -???????????En...AJAX Control Toolkit: June 2012 Release: AJAX Control Toolkit Release Notes - June 2012 Release Version 60623June 2012 release of the AJAX Control Toolkit. AJAX Control Toolkit .NET 4 – AJAX Control Toolkit for .NET 4 and sample site (Recommended). AJAX Control Toolkit .NET 3.5 – AJAX Control Toolkit for .NET 3.5 and sample site (Recommended). Notes: - The current version of the AJAX Control Toolkit is not compatible with ASP.NET 2.0. The latest version that is compatible with ASP.NET 2.0 can be found here: 11121. - Pages using ...WPF Application Framework (WAF): WPF Application Framework (WAF) 2.5.0.5: Version: 2.5.0.5 (Milestone 5): This release contains the source code of the WPF Application Framework (WAF) and the sample applications. Requirements .NET Framework 4.0 (The package contains a solution file for Visual Studio 2010) The unit test projects require Visual Studio 2010 Professional Changelog Legend: [B] Breaking change; [O] Marked member as obsolete WAF: Add IsInDesignMode property to the WafConfiguration class. WAF: Introduce the IModuleController interface. WAF: Add ...Windows 8 Metro RSS Reader: Metro RSS Reader.v7: Updated for Windows 8 Release Preview Changed background and foreground colors Used VariableSizeGrid layout to wrap blog posts with images Sort items with Images first, text-only last Enabled Caching to improve navigation between framesConfuser: Confuser 1.9: Change log: * Stable output (i.e. 
given the same seed & input assemblies, you'll get the same output assemblies) + Generate debug symbols, now it is possible to debug the output under a debugger! (Of course without enabling anti debug) + Generating obfuscation database, most of the obfuscation data in stored in it. + Two tools utilizing the obfuscation database (Database viewer & Stack trace decoder) * Change the protection scheme -----Please read Bug Report before you report a bug-----...XDesigner.Development: First release: First releaseNew ProjectsArcGIS Server Rest Catalog: jquery widget that display all MapServer service of ESRI ArcGIS Server in accordion control. Require jQuery and JsRender. BugsBox.Lib: BugsBoxlibCodeStudy: The project includes all my code written for .NET studyCommunity Tfs Team Tools: Community TFS Team Tools is a community project based on the example code from ALM Rangers - Quick Response Sample Command line utility to manage TFS TeamsDiveBaseManager: DiveBaseManagerEclipse App: Aplicacion para android para el envio de coordenaads a un servidorJQMdotNET: JQMdotNet is an early attempt to make a series of MVC HTML helpers to quickly render JQuery Mobile pages.MathBuilderFramework: Math Builder FX is a framework for math problems. The idea of this framework is to create real exercises of mathematics throught base classes.MDS MODELING WORKBOOK: MDS Modeling Workbook is a modeling tool and a solution accelerator for Microsoft Master Data Services. Minesweeper: a clean old minesweeperMultithreaded Port Scanner Utility: Mulithreaded Port Scanner Utility is a very simple port scanner that take advantage of the new System.Threading.Task namespace in .Net 4.0Nonocast.Data: Nonocast.Data is a free, open source developer focused object persistence for small and medium software.On{X} Scripts: This project is to share scripts I create or modify from available free/opensource scripts (off-course with due mentions & links to original developer).PCS MAP: ladPowerShell Module for Mayhem: A module for Mayhem with a reaction which executes a PowerShell script.PunkBuster™ Screenshot Viewer: PunkBuster™ Screenshot Viewer shows screenshots of all games protected with PunkBuster™.Random reminder: Objective: To create a simple Windows application that can schedule and automatically reschedule a random timer that falls within a defined interval.Remembrall: Some useful stuff (at least for me) ! Javascript plugins jQuery plugins C# utilities ASP.NET MVC helper extensionsrepotfs: Repositório de exemplosSharepoint Designer: Sharepoint ExplororSPFluid: Simple modularity and data access framework.SPManager: Helper classSQL Server : xp_fixeddrives2: C++ Extended Stored Procedure Get volumes InfoSWORD Extensions for Sharepoint: Push documents in and out of Sharepoint using the SWORD protocol. ***incomplete***Test projects: My testing projecttestdd06282012git01: xzc testdd06282012git1: sadtestdd06282012tfs01: xzctestddhg06282012: xctestddtfs062820121: ,lm;.testhg06282012dd1: zxTlrk: NATotal_Mayhem: Add-on for Mayhem ( http://mayhem.codeplex.com/ ) that includes networking events, power reactions, and more.WallSwitch: An application to cycle your desktop wallpaper.??????: ????????

    Read the article

  • I Didn’t Get You Anything…

    - by Bob Rhubart
    Nearly every day this blog features a list of posts and articles written by members of the OTN architect community. But with Christmas just days away, I thought a break in that routine was in order. After all, if the holidays aren’t excuse enough for an off-topic post, then the terrorists have won. Rather than buy gifts for everyone -- which, given the readership of this blog and my budget, could amount to a cash outlay of upwards of $15.00 -- I thought I’d share a bit of holiday humor. I wrote the following essay back in the mid-90s, for a “print” publication that used “paper” as a content delivery system. That was then. I’m older now, my kids are older, but my feelings toward the holidays haven’t changed… It’s New, It’s Improved, It’s Christmas! The holidays are a time of rituals. Some of these, like the shopping, the music, the decorations, and the food, are comforting in their predictability. Other rituals, like the shopping, the music, the decorations, and the food, can leave you curled into the fetal position in some dark corner, whimpering. How you react to these various rituals depends a lot on your general disposition and credit card balance. I, for one, love Christmas. But there is one Christmas ritual that really tangles my tinsel: the seasonal editorializing about how our modern celebration of the holidays pales in comparison to that of Christmas past. It's not that the old notions of how to celebrate the holidays aren't all cozy and romantic--you can't watch marathon broadcasts of "It's A Wonderful White Christmas Carol On Thirty-Fourth Street Story" without a nostalgic teardrop or two falling onto your plate of Christmas nachos. It's just that the loudest cheerleaders for "old-fashioned" holiday celebrations overlook the fact that way-back-when those people didn't have the option of doing it any other way. Dashing through the snow in a one-horse open sleigh? No thanks. When Christmas morning rolls around, I'm going to be mighty grateful that the family is going to hop into a nice warm Toyota for the ride over to grandma's place. I figure a horse-drawn sleigh is big fun for maybe fifteen minutes. After that you’re going to want Old Dobbin to haul ass back to someplace warm where the egg nog is spiked and the family can gather in the flickering glow of a giant TV and contemplate the true meaning of football. Chestnuts roasting on an open fire? Sorry, no fireplace. We've got a furnace for heat, and stuffing nuts in there voids the warranty. Any of the roasting we do these days is in the microwave, and I'm pretty sure that if you put chestnuts in the microwave they would become little yuletide hand grenades. Although, if you've got a snoot full of Yule grog, watching chestnuts explode in your microwave might be a real holiday hoot. Some people may see microwave ovens as a symptom of creeping non-traditional holiday-ism. But I'll bet you that if there were microwave ovens around in Charles Dickens' day, the Cratchits wouldn't have had to entertain an uncharacteristically giddy Scrooge for six or seven hours while the goose cooked. Holiday entertaining is, in fact, the one area that even the most severe critic of modern practices would have to admit has not changed since Tim was Tiny. A good holiday celebration, then as now, involves lots of food, free-flowing drink, and a gathering of friends and family, some of whom you are about as happy to see as a subpoena.
Just as the Cratchits' Christmas was spent with a man who, for all they knew, had suffered some kind of head trauma, so the modern holiday gathering includes relatives or acquaintances who, because they watch too many talk shows, and/or have poor personal hygiene, and/or fail to maintain scheduled medication, you would normally avoid like a plate of frosted botulism. But in the season of good will towards men, you smile warmly at the mystery uncle wandering around half-crocked with a clump of mistletoe dangling from the bill of his N.R.A. cap. Dickens' story wouldn't have become the holiday classic it has if, having spotted on their doorstep an insanely grinning, raw poultry-bearing, fresh-off-a-rough-night Scrooge, the Cratchits had pulled their shades and pretended not to be home. Which is probably what I would have done. Instead, knowing full well his reputation as a career grouch, they welcomed him into their home, and we have a touching story that teaches a valuable lesson about how the Christmas spirit can get the boss to pump up the payroll. Despite what the critics might say, our modern Christmas isn't all that different from those of long ago. Sure, the technology has changed, but that just means a bigger, brighter, louder Christmas, with lasers and holograms and stuff. It's our modern celebration of a season that even the least spiritual among us recognizes as a time of hope that the nutcases of the world will wake up and realize that peace on earth is a win/win proposition for everybody. If Christmas has changed, it's for the better. We should continue making Christmas bigger and louder and shinier until everybody gets it. *** Happy Holidays, everyone!

    Read the article

  • UPK Content State

    - by peter.maravelias
    State is an editable property for communicating the status of a document in the UPK library. This is particularly helpful when working with other authors in a development team. Authors can assign a state to any document using the values that are defined in the master list. The default master list of State values includes Not Started, Draft, In Review, and Final (in the language installed on the server). Administrators can customize the list by adding, deleting, or renaming the values as well as sequencing the values as they will appear on the assignment list from the Properties pane. Let us know if or how you are using UPK Content States in your development efforts!
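    In effect, the master list is a small, ordered, admin-editable set of values that a document's State property is validated against. Below is a minimal Python sketch of that idea, purely for illustration: the class names, the "Ready for Translation" custom state, the example document name, and the validation logic are assumptions, not UPK's actual object model or API (in the product all of this is managed through the UPK Developer interface and server, not code).

```python
# Illustrative sketch only (not UPK's API): a customizable master list of
# content states plus documents whose State must come from that list.

class StateMasterList:
    """Ordered list of state values that administrators maintain."""

    DEFAULTS = ["Not Started", "Draft", "In Review", "Final"]

    def __init__(self, values=None):
        # Start from the default master list unless a custom one is supplied.
        self.values = list(values) if values else list(self.DEFAULTS)

    def add(self, value, position=None):
        # Administrators can add a new state, optionally at a given position.
        if value in self.values:
            raise ValueError(f"State '{value}' already exists")
        if position is None:
            self.values.append(value)
        else:
            self.values.insert(position, value)

    def rename(self, old, new):
        # Rename an existing state value in place.
        self.values[self.values.index(old)] = new

    def delete(self, value):
        self.values.remove(value)

    def move(self, value, new_position):
        # Sequencing controls the order shown in the assignment list.
        self.values.remove(value)
        self.values.insert(new_position, value)


class LibraryDocument:
    """A document whose State property is validated against the master list."""

    def __init__(self, name, master_list):
        self.name = name
        self.master_list = master_list
        self.state = "Not Started"  # assumed starting state for this sketch

    def set_state(self, value):
        # Authors can only assign values defined in the master list.
        if value not in self.master_list.values:
            raise ValueError(f"'{value}' is not a value in the master list")
        self.state = value


if __name__ == "__main__":
    states = StateMasterList()
    states.add("Ready for Translation")           # hypothetical custom state
    doc = LibraryDocument("Create Purchase Order", states)
    doc.set_state("In Review")
    print(doc.name, "->", doc.state)
```

    The point of the sketch is simply the separation of concerns the feature describes: one shared, sequenced master list owned by administrators, and per-document State values that authors pick from it.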

    Read the article

  • Social Business Forum Milano: Day 2

    - by me
    @YourService. The business world has flipped and small business can capitalize, by Frank Eliason (twitter: @FrankEliason). Technology and social media tools have made it easier than ever for companies to communicate with consumers. They can listen and join in on conversations, solve problems, get instant feedback about their products and services, and more. So why, then, are most companies not doing this? Instead, it seems as if customer service is at an all-time low, and the few companies that are choosing to focus on their customers are experiencing a great competitive advantage. At Your Service explains the importance of refocusing your business on your customers and your employees, and just how to do it. The book explains how to create a culture of empowered employees who understand the value of a great customer experience, and advises on the need to communicate that experience to customers and potential customers. Frank Eliason, recognized by BusinessWeek as the 'most famous customer service manager in the US, possibly in the world,' has built a reputation for helping large businesses improve the way they connect with customers and enhance their relationships. Quotes from the audience: Bertrand Duperrin @bduperrin "social service is not about shutting up the loudest cutsomers ! #sbf12 @frankeliason"; Paolo Pelloni @paolopelloni; Gautam Ghosh @GautamGhosh "RT @cecildijoux: #sbf12 @frankeliason you need to change things and fix the approach it's not about social media it's about driving change"; Peter H. Reiser @peterreiser "#sbf12 Company Experience = Product Experience + Customer Interactions + Employee Experience @yourservice Engage or lose!" Socialize, mobilize, conversify: engage your employees to improve business performance, by Christian Finn (twitter: @cfinn). First, Christian presented the flying monkey. Then he outlined the four principles to fix the intranet: 1. Socialize the Intranet 2. Get Thee to a Single Repository 3. Mobilize the Intranet 4. Conversationalize Your Processes. Quotes from the audience: Oscar Berg @oscarberg "Engaged employees think their work bring out the best of their ideas @cfinn #sbf12 http://pic.twitter.com/68eddp48"; John Stepper @johnstepper "I like @cfinn's 'conversify your processes' A nice related concept to 'narrating your work', part of working out loud. http://johnstepper.com/2012/05/26/working-out-loud-your-personal-content-strategy/"; Oscar Berg @oscarberg "Organizations are talent markets - socializing your intranet makes this market function better @cfinn #sbf12". For profit, productivity, and personal benefit: creating a collaborative culture at Deutsche Bank, by John Stepper (twitter: @johnstepper), on driving adoption of collaboration and social media platforms at Deutsche Bank. John shared some great best practices on how to deploy an enterprise-wide community model in a large company. He started with the most important question: what is the commercial value of adding social? Then he talked about the successful deployment of Communities of Practice and outlined some key use cases, including the measures used to prove the ROI of the investment. Examples: Community of Practice -> measure: systematic collection of value stories; self-service website -> measure: based on representative models; optimizing asset inventory -> measure: actual counts. This last use case was particularly interesting. It is a crowd-sourced infrastructure spending/saving model: users can cancel IT services they don't need (for example, Software xx).
5% of the saving goes to social responsibility projects. Then John outlined some best practices on how to address the WIIFM (What's In It For Me) question of individual users: change from hierarchy to graph; working out loud = observable work + narrating your work; add social skills to career objectives (for example, building a purposeful social network course/training as part of the job development curriculum). And last but not least, John gave some important tips on how to get senior-management buy-in by establishing management-sponsored, division-level collaboration boards that define clear use cases and measures. These divisional use cases are then implemented on a common social platform. Thanks John - I learned a lot from your presentation! Quotes from the audience: Ana Silva @AnaDataGirl "#sbf12 what's in it for individuals at Deutsche Bank? Shapping their reputations in a big org says @johnstepper #e20"; Ana Silva @AnaDataGirl "Any reason why not? MT @magatorlibero #sbf12 is Deutsche B. experience on applying social inside company applicable to Italian people?"; Oscar Berg @oscarberg "Your career is not a ladder, it is a network that opens up opportunities - @johnstepper #sbf12"; Oscar Berg @oscarberg "@johnstepper: Institutionalizing collaboration is next - collaboration woven into the fabric of daily work #sbf12"; Ana Silva @AnaDataGirl "#sbf12 @johnstepper talking about how Deutsche Bank is using #socbiz to build purposeful CoP & save money"

    Read the article

< Previous Page | 657 658 659 660 661 662 663 664 665 666 667 668  | Next Page >