Search Results

Search found 4458 results on 179 pages for 'individual improvement'.


  • Patrick Curran Session-Keynote at DOAG 2012

    - by Heather VanCura
    Patrick Curran, Chair of the Java Community Process (JCP) and Director of the JCP Program Management Office, will be speaking this week at the DOAG 2012 event in Nuremberg, Germany. Keynote: "Java: Restructuring the Java Community Process", November 22nd, 09:00-09:45 am. The Java Community Process (JCP) plays a critical role in the evolution of Java. This keynote will explain how the JCP is organized and how interested members of the Java community - commercial organizations, non-profits, Java user groups, and individual developers - work together to advance the Java language and platforms. It will then discuss recent and upcoming changes to the JCP's structure and operating processes, and will explain how these changes ('JCP.next') will make the organization more efficient and will ensure that its work is carried out in a more open and transparent manner.

    Read the article

  • Alpha blending without depth writing

    - by teodron
    A recurring problem I get is this one: given two different billboard sets with alpha textures intended to create particle special effects (such as point lights and smoke puffs), rendering them correctly is tedious. The issue in this scenario is that there's no way to use depth writing and make the billboards obey depth information: some appear in front of others that are clearly closer to the camera. I've described the problem on the Ogre forums several times without any suggestions being given (since the application I'm writing uses their engine). What could be done, then? Sort all individual billboards from the different billboard sets, avoid writing depth, and still have nice alpha-blended results? If yes, please do point out some resources to start with for the aforementioned Ogre engine. Any other suggestions are welcome!
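    For reference, the sorting idea hinted at above is the common approach: keep depth testing on so opaque geometry still occludes the particles, disable depth writes for the blended pass, and draw the combined billboards back to front. The sketch below is engine-agnostic Java for illustration only; Billboard, Vec3, and Renderer are hypothetical placeholder types, not Ogre API.

        import java.util.Comparator;
        import java.util.List;

        // Placeholder types standing in for whatever the engine provides.
        interface Vec3 { double distanceSquared(Vec3 other); }
        interface Billboard { Vec3 position(); }
        interface Renderer {
            void setDepthWrite(boolean enabled);
            void setDepthTest(boolean enabled);
            void draw(Billboard billboard);
        }

        // Alpha-blended billboard pass: depth test on, depth write off, far-to-near draw order.
        final class BillboardPass {
            static void render(List<Billboard> billboards, Vec3 cameraPos, Renderer renderer) {
                // Sort the combined set from all billboard groups back to front.
                billboards.sort(Comparator.comparingDouble(
                        (Billboard b) -> b.position().distanceSquared(cameraPos)).reversed());

                renderer.setDepthWrite(false); // opaque geometry has already written depth
                renderer.setDepthTest(true);   // still reject billboards hidden behind opaque geometry
                for (Billboard billboard : billboards) {
                    renderer.draw(billboard);
                }
                renderer.setDepthWrite(true);
            }
        }

    The per-billboard sort across sets is the cost of this approach; it avoids depth writes entirely while still giving mostly correct layering between the two particle systems.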

    Read the article

  • LibreOffice Icons are blurry

    - by Ryan McClure
    My LibreOffice icons for the individual apps are fine; they look great in both the launcher and the switcher. Yet, if I open the apps from the main LibreOffice program, or if I open the document that I want to edit (and it opens its own "icon" as it always does), the icon is incredibly blurry. Here's what it looks like: on the launcher, I put the actual Calc and Impress launchers, and on their left are the icons opened from a document. I know they aren't "blurry" so much as they are smaller. What should I do to remedy this? They are the same in the switcher (I can't find a way to take a screenshot of the switcher). Edit: I changed my Unity plugin from Rotated to standard, and the problem is still there, so it isn't a Rotated bug.

    Read the article

  • Unity Dash/Lens - Auto-completion based on recent search strings

    - by Anant
    Sometimes I find myself typing the same (or quite similar) search strings inside a Unity Lens. So I wondered whether it's possible for the Lens to remember previous searches and provide a drop-down menu of possible suggestions (based on the past) when I start typing a new query. With Lenses for sites like Wikipedia and DuckDuckGo, the search strings are getting longer, and this feature would lend a helping hand in filling out queries faster. This could be something applicable to all Lenses, with later versions allowing individual Lenses to run their own auto-completion algorithm.

    Read the article

  • Checkers AI Algorithm

    - by John
    I am making an AI for my checkers game and I'm trying to make it as hard as possible. Here are the current criteria for a move on the hardest difficulty:
    1: Look for a block: this is when a piece is being threatened and another piece can be moved in behind it to protect it. Here is an example:
    Black Moves
        |W| |W| |W| |W|
        | | |W| |W| |W| |W|
        |W| | | |W| |W|
        | | | | |W| | | | |
        | | | | |B| | |
        | | |B| | | |B| |B|
        |B| |B| |B| |B|
        | | |B| |B| |B| |B|
    White Blocks
        |W| |W| |W| |W|
        | | |W| | | |W| |W|
        |W| |W| |W| |W|
        | | | | |W| | | | |
        | | | | |B| | |
        | | |B| | | |B| |B|
        |B| |B| |B| |B|
        | | |B| |B| |B| |B|
    2: Move pieces out of danger: if any piece is being threatened, and a piece cannot block for that piece, then it will attempt to move out of the way. If the piece cannot move out of the way without still being in danger, the computer ignores the piece.
    3: If the computer player owns any kings, it will attempt to 'hunt down' enemy pieces on the board; if no moves can be made that won't endanger the king or any other pieces, the computer ignores this rule.
    4: Any piece that is owned by the computer that is in column 1 or 6 will attempt to go to a side. When a piece is in column 0 or 7, it is in a very strategic position because it cannot get captured while it is in either of these columns.
    5: It makes an educated random move: the move will not endanger the piece that is moving or any other piece on the board.
    6: If none of the above are possible, it makes a random move.
    This question is not really specific to any language, but if all examples could be in Java that would be great, considering this app is written for Android. Does anyone see any room for improvement in this algorithm? Anything that would make it better at playing checkers?
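    Since the question asks for Java: one structural improvement that keeps the priority list above easy to extend, reorder, and test is to express each numbered rule as its own object and walk the rules in order until one yields a move. This is only a sketch of that shape; Board, Move, and the concrete rules are hypothetical placeholders, not code from the question.

        import java.util.Arrays;
        import java.util.List;

        // Placeholder types standing in for the game's own classes.
        final class Board { /* squares, whose turn it is, etc. */ }
        final class Move { /* from/to coordinates */ }

        interface MoveRule {
            // Returns a move if this rule applies to the position, or null to fall through.
            Move choose(Board board);
        }

        final class CheckersAi {
            private final List<MoveRule> rulesInPriorityOrder;

            CheckersAi(MoveRule... rules) {
                // e.g. block, escape danger, king hunt, seek edge column, safe random, any random
                this.rulesInPriorityOrder = Arrays.asList(rules);
            }

            Move chooseMove(Board board) {
                for (MoveRule rule : rulesInPriorityOrder) {
                    Move move = rule.choose(board);
                    if (move != null) {
                        return move; // first rule that produces a move wins
                    }
                }
                throw new IllegalStateException("no legal move available");
            }
        }

    Each rule can then be unit-tested against a hand-built Board, and tuning a difficulty level becomes a matter of changing which rules go into the list and in what order.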

    Read the article

  • Creating a Successful Cloud Roadmap

    - by stephen.g.bennett
    No matter what type of cloud services or deployment models you are considering as part of your overall IT strategy, you must have a cloud services adoption roadmap to guide your journey. A cloud services adoption roadmap provides guidance that enables multiple projects to progress in parallel yet remain coordinated and ultimately result in a common end goal. The cloud services adoption roadmap consists of program-level efforts and a portfolio of cloud services. The program-level effort creates strategic assets such as the cloud architecture, cloud infrastructure, cloud governance, risk, and compliance (GRC) processes, and security policies that are leveraged across all the individual projects. A feature article on this topic can be found in the latest SOA and Cloud Magazine.

    Read the article

  • SQL SERVER – 2014 Announced and SQL Server 2014 Datasheet

    - by Pinal Dave
    Earlier this week Microsoft announced SQL Server 2014. The trial of SQL Server 2014 is believed to be released later this year. Here are a few of the improvements I was looking forward to that are in this release. Please note this is not an exhaustive list of the features of SQL Server 2014; it is a quick note on the features which I was looking forward to and which will now be available in this release.
    - AlwaysOn now supports 8 secondaries instead of 4
    - Online indexing at the partition level – this is a good thing, as index rebuilding can now be done at a partition level
    - Statistics at the partition level – this will be a huge improvement in performance
    - In-Memory OLTP works by providing in-application memory storage for the most often used tables in SQL Server (Read More Here)
    - Columnstore indexes can be updated – I just can't wait for this feature (Columnstore Index)
    - Resource Governor can control IO along with CPU and memory
    - Increased performance by extending the SQL Server in-memory buffer pool to SSDs
    - Backup to Azure Storage
    You can additionally download the SQL Server 2014 Datasheet from here. Which is your favorite new feature of SQL Server 2014? Reference: Pinal Dave (http://blog.sqlauthority.com)

    Read the article

  • Sprite sheets, Clamp or Wrap?

    - by David
    I'm using a combination of sprite sheets for, well, sprites, and individual textures for infinite tiling. For the tiling textures I'm obviously using Wrap to draw the entire surface in one call, but up until now I've been making a separate batch using Clamp for drawing sprites from the sprite sheets. The sprite sheets include a border (repeating the edge pixels of each sprite), and my code uses the correct source coordinates for sprites. But since I never give coordinates outside of the texture when drawing sprites (and indeed the border exists to prevent bleed-over when filtering), it has struck me that I'd be better off just using Wrap so that I can combine everything into one batch. I just want to be sure that I haven't overlooked something obvious. Is there any reason that Wrap would be harmful when used with a sprite sheet?
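    For comparison, the same Wrap/Clamp choice exists outside XNA as a texture addressing mode. The sketch below is raw OpenGL via LWJGL, purely to illustrate the distinction the question reasons about (it is not the sampler-state API the question uses): Wrap repeats the texture for coordinates outside [0,1], while Clamp stretches the edge texels. The question's own argument is that with UVs kept inside each sprite's rectangle plus the duplicated-edge gutter, sampling never reaches the region where the two modes behave differently.

        import static org.lwjgl.opengl.GL11.GL_REPEAT;
        import static org.lwjgl.opengl.GL11.GL_TEXTURE_2D;
        import static org.lwjgl.opengl.GL11.GL_TEXTURE_WRAP_S;
        import static org.lwjgl.opengl.GL11.GL_TEXTURE_WRAP_T;
        import static org.lwjgl.opengl.GL11.glBindTexture;
        import static org.lwjgl.opengl.GL11.glTexParameteri;
        import static org.lwjgl.opengl.GL12.GL_CLAMP_TO_EDGE;

        // Texture addressing-mode sketch (OpenGL via LWJGL, for illustration only).
        final class AddressingModes {
            // Wrap: coordinates outside [0,1] repeat the image (what the tiling pass relies on).
            static void useWrap(int textureId) {
                glBindTexture(GL_TEXTURE_2D, textureId);
                glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
                glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
            }

            // Clamp: coordinates outside [0,1] stretch the edge texels (typical sprite-sheet default).
            static void useClamp(int textureId) {
                glBindTexture(GL_TEXTURE_2D, textureId);
                glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
                glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
            }
        }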

    Read the article

  • Data Storage Options

    - by Kenneth
    When I was working as a website designer/engineer I primarily used databases for storage of much of my dynamic data. It was very easy and convenient to use this method, and it seemed like standard practice from my research on the matter. I'm now working on shifting away from websites and into desktop applications. What are the best practices for data storage for desktop applications? I ask because I have noticed that most programs I use on a personal level don't appear to use a database for data storage unless it's embedded in the program. (I'm not thinking of an application like a word processor, where it makes sense to have data stored in individual files as defined by the user. Rather, I'm thinking of something more along the lines of a calendar application, which would need to store dates and event info and such, where accessing that information would be much easier if stored in a database... at least as far as my experience would indicate.) Thanks for the input!
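    The embedded-database pattern mentioned above usually means the application ships a single database file and a driver, with no server to install or administer. Below is a minimal sketch of that idea for the hypothetical calendar example, using SQLite through the xerial sqlite-jdbc driver; the driver choice, file name, and schema are illustrative assumptions, not something from the question.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;
        import java.sql.SQLException;
        import java.sql.Statement;

        // Embedded-database sketch: events live in one SQLite file next to the app.
        final class EventStore {
            private final Connection connection;

            EventStore(String dbFilePath) throws SQLException {
                // sqlite-jdbc (assumed to be on the classpath) creates the file on first use.
                connection = DriverManager.getConnection("jdbc:sqlite:" + dbFilePath);
                try (Statement stmt = connection.createStatement()) {
                    stmt.execute("CREATE TABLE IF NOT EXISTS event ("
                            + "id INTEGER PRIMARY KEY, "
                            + "starts_at TEXT NOT NULL, "
                            + "title TEXT NOT NULL)");
                }
            }

            void addEvent(String startsAtIso, String title) throws SQLException {
                try (PreparedStatement ps = connection.prepareStatement(
                        "INSERT INTO event (starts_at, title) VALUES (?, ?)")) {
                    ps.setString(1, startsAtIso);
                    ps.setString(2, title);
                    ps.executeUpdate();
                }
            }
        }

    The user still sees ordinary files on disk, but the application gets indexed queries ("events between these dates") without the installation burden of a database server.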

    Read the article

  • How Circuit Boards Are Manufactured and Tested [Video]

    - by Jason Fitzpatrick
    Circuit boards are in nearly everything: computers, cars, toys, phones, even greeting cards. Check out this tour of a Printed Circuit Board (PCB) factory to see how they're made. In the above video the owners of Base2 Electronics are watching a PCB testing machine at the factory where they purchase their boards for resale. The machine first scans the board to identify it in the board database, and then the arms start flying as it tests individual circuits on the board. If you're interested in seeing all the steps of the manufacturing process, hit up the link below for a photo and video tour of the facility. Base2 Electronics Tour of Advanced Circuits [via Hack A Day]

    Read the article

  • London-based IT Training company seeks developers interested in achieving Microsoft Certifications

    IT Training company MCP Guru, based near Canary Wharf, is looking to fill the last available places on several Microsoft courses. All certifications are available. Learners can study in-class, at work or at home, on weekdays and weekends, day or night. All instructors possess several years of software and web development experience, and all are fully licensed. Individual learners get a 30% discount; groups of 2 or more get a 50% discount. Hurry! Last few places remaining! Offer ends April 30th. Contact Jatinder at [email protected].

    Read the article

  • Platform for Efficiency: Boeing Defense, Space & Security integrates supply chain processes using Oracle Business Process Management solutions. by Fred Sandsmark

    - by JuergenKress
    Like most companies, aerospace giant Boeing has its jargon - words and phrases that uniquely define its products and processes. Take the word platform. It is used at Boeing to mean a family of aircraft - the F/A-18 fighter, for example, or the 777 jetliner. But since August 2009, employees in the Global Services & Support (GS&S) division of Boeing Defense, Space & Security have been talking about a different sort of platform: a supply chain technology platform, based on Oracle Business Process Management (Oracle BPM) solutions and Oracle SOA Suite. That platform, built with the assistance of Oracle Diamond Partner Capgemini, is serving as a jumping-off point for Boeing's GS&S staff to deploy radically improved business processes supported by Oracle Fusion Applications to build a high-visibility, end-to-end supply chain. This business process-driven technology platform has ambitious goals: to help GS&S respond more quickly and accurately to its customers' needs, to make business processes at all GS&S sites more consistent and less expensive, and to create a foundation for further improvement and efficiency. Read the full article here. Want to publish your BPM11g success story or request a partner/customer reference? The BPM Center of Excellence and First 100 Days of BPM documents are available in our SOA Community Workspace: MWD_bpm_si_Centre_of_Excellence_0811.pdf and First 100 Days of BPM whitepaper.pdf. Please visit our SOA Community Workspace (SOA Community membership required). For regular information on Oracle SOA Suite become a member of the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center.

    Read the article

  • Repairing The Visual Studio 2012 UI

    - by Ken Cox [MVP]
    I have sympathy for 'Softies who don't like the controversial 'Metro' UI changes but are afraid to say so. After all, who wants to commit a CLM (Career-Limiting Move) by declaring that the Emperor has no clothes (or gradients) and that ALL CAPS IN MENUS ARE DUMB? Talk about power! Here's a higher-up (anyone got a name?) who has enforced a flat, monochrome, uninteresting user interface in Visual Studio 2012 that has been damned with faint praise by consumers. The pushback must have been enormous. Some 'Softies disengage from the raging debate with, "It's not my decision," while others feebly point out that the addition of some colour pixels in the icons is a real improvement over the beta version. True, I guess. With the UI pretty much locked, it's down to repairing the damage. Fortunately, some Empire dissident has leaked the news to a blogger that those SHOUTING CAPS aren't hardcoded after all: How To Prevent Visual Studio 2012 ALL CAPS Menus. And so it goes. By RTM, I'm sure there will be many more add-ons to help us 'de-Metro' VS 2012 and recreate our favourite Visual Studio 2010 themes for it.

    Read the article

  • SQL SERVER – Finding Size of a Columnstore Index Using DMVs

    - by pinaldave
    The columnstore index is one of my favorite enhancements in SQL Server 2012. A columnstore index stores each column in a separate set of disk pages, rather than storing multiple rows per page as data has traditionally been stored. In the case of row store indexes, multiple pages contain multiple rows, with the columns spanning across multiple pages; in the case of columnstore indexes, multiple pages contain (multiple) single columns. Columnstore indexes are compressed by default and occupy much less space than a regular row store index. One very common question I see is the need for a list of columnstore indexes along with their size and corresponding table name. I quickly re-wrote a script using the DMVs sys.indexes and sys.dm_db_partition_stats. This script gives the size of the columnstore index on disk only. I am sure there will be more advanced scripts to retrieve details related to the components associated with the columnstore index. However, I believe the following script is sufficient to start getting an idea of columnstore index size.

        SELECT OBJECT_SCHEMA_NAME(i.OBJECT_ID) SchemaName,
               OBJECT_NAME(i.OBJECT_ID) TableName,
               i.name IndexName,
               SUM(s.used_page_count) / 128.0 IndexSizeinMB
        FROM sys.indexes AS i
        INNER JOIN sys.dm_db_partition_stats AS s
            ON i.OBJECT_ID = s.OBJECT_ID AND i.index_id = s.index_id
        WHERE i.type_desc = 'NONCLUSTERED COLUMNSTORE'
        GROUP BY i.OBJECT_ID, i.name

    Here is my introductory article on SQL Server Fundamentals of Columnstore Index. Create a sample columnstore index based on the script described in the earlier article; it will give the following results. Please feel free to suggest improvements to the script so I can further modify it to accommodate enhancements. Reference: Pinal Dave (http://blog.sqlauthority.com)

    Read the article

  • How do I get the compression on specific dynamic body

    - by Mike JM
    Sorry, I could not find any tag that would suit my question. Let me first show you the image and then write what I want to do: I'm using Box2D. As you can see, there are three dynamic bodies connected to each other (think of it as a table seen from the front). LEG1 and LEG2 are connected to the static body (it's the ground body). Another dynamic body is falling onto the table. I need to get the compression in LEG1 and LEG2 separately. Joints have a GetReactionForce() function which returns a b2Vec2, which in turn has Length() and LengthSquared() functions. This will give the total sum of the forces in any given joint. But what I need is the forces in the individual bodies that are connected with joints. Once you connect several bodies with a single joint, it will again show the sum of forces, which is not useful. Here's the case I'm talking about:

    Read the article

  • Using a Predicate as a key to a Dictionary

    - by Tom Hines
    I really love Linq and Lambda Expressions in C#. I also love certain community forums and programming websites like DaniWeb. A user on DaniWeb posted a question about comparing the results of a game that is like poker (5-card stud), but is played with dice. The question centered on determining the winning hand. I looked at the question and issued some comments and suggestions toward a potential answer, but I thought it was a neat homework exercise. [A little explanation] I eventually realized not only could I compare the results of the hands (by name) with a certain construct - I could also compare the values of the individual dice with the same construct. That piece of code eventually became a Dictionary with the KEY as a Predicate<int> and the Value a Func<T> that returns a string from another structure that contains the mapping of an ENUM to a string. In one instance, that string is the name of the hand, and in another instance, it is a string (CSV) representation of the digits in the hand. An added benefit is that the digits are returned in the order they would be for a proper poker hand. For instance, the hand 1,2,5,3,1 would be returned as ONE_PAIR (1,1,5,3,2). [Getting to the point]

        using System;
        using System.Collections.Generic;

        namespace DicePoker
        {
            using KVP_E2S = KeyValuePair<CDicePoker.E_DICE_POKER_HAND_VAL, string>;
            public partial class CDicePoker
            {
                /// <summary>
                /// Magical construction to determine the winner of given hand Key/Value.
                /// </summary>
                private static Dictionary<Predicate<int>, Func<List<KVP_E2S>, string>>
                    map_prd2fn = new Dictionary<Predicate<int>, Func<List<KVP_E2S>, string>>
                {
                    {new Predicate<int>(i => i.Equals(0)), PlayerTie}, // first tie

                    {new Predicate<int>(i => i > 0),
                        (m => string.Format("Player One wins\n1={0}({1})\n2={2}({3})",
                            m[0].Key, m[0].Value, m[1].Key, m[1].Value))},

                    {new Predicate<int>(i => i < 0),
                        (m => string.Format("Player Two wins\n2={2}({3})\n1={0}({1})",
                            m[0].Key, m[0].Value, m[1].Key, m[1].Value))},

                    {new Predicate<int>(i => i.Equals(0)),
                        (m => string.Format("Tie({0}) \n1={1}\n2={2}",
                            m[0].Key, m[0].Value, m[1].Value))}
                };
            }
        }

    When this is called, the code calls the Invoke method of the predicate to return a bool. The first one matching true will have its value invoked.

        private static Func<DICE_HAND, E_DICE_POKER_HAND_VAL> GetHandEval = dh =>
            map_dph2fn[map_dph2fn.Keys.Where(enm2fn => enm2fn(dh)).First()];

    After coming up with this process, I realized (with a little modification) it could be called to evaluate the individual values in the dice hand in the event of a tie.

        private static Func<List<KVP_E2S>, string> PlayerTie = lst_kvp =>
            map_prd2fn.Skip(1)
                .Where(x => x.Key.Invoke(RenderDigits(dhPlayerOne).CompareTo(RenderDigits(dhPlayerTwo))))
                .Select(s => s.Value)
                .First().Invoke(lst_kvp);

    After that, I realized I could now create a program completely without "if" statements or "for" loops!

        static void Main(string[] args)
        {
            Dictionary<Predicate<int>, Action<Action<string>>> main =
                new Dictionary<Predicate<int>, Action<Action<string>>>
            {
                {(i => i.Equals(0)), PlayGame},
                {(i => true), Usage}
            };

            main[main.Keys.Where(m => m.Invoke(args.Length)).First()].Invoke(Display);
        }

    ...and there you have it. :) ZIPPED Project

    Read the article

  • How to Search for (and Find) Solaris Docs

    - by rickramsey
    Just the other day, I went to the recently-released Oracle Solaris 11 library to search for information about the print service changes. I knew there had been changes in Oracle Solaris 11, but could not remember the new approach to printing. So, being the optimist that I (never) am, I went to the Oracle Solaris 11 Information Library on docs.oracle.com and typed "print service" into the search box. Imagine my surprise when the response back was: We did not find any search results for: print service site:download.oracle.com url:/docs/cd/E23824_01. OMG! WTF? Are you kidding me? After throwing a few stuffed animals at my computer screen, I tried again. Is search broken? Well, sort of (and I'm trying to get it fixed). In the meantime, however, there is a reasonably simple user workaround. Possibly unnoticed by most people, there is a Within drop-down menu on the Oracle search results page. If you simply open the Within menu, select Documentation, and click the little magnifying glass again, you (should) get the expected results. Is it perfect? No, but at least it's an improvement over being completely broken. - Janice Critchlow, Information Architect, Systems

    Read the article

  • WordPress page title repeated in SOME pages

    - by cmykrgbb
    I have created a WordPress site and titles were working just fine. Then, some time and a few plugins later, I noticed that on SOME pages the title is repeated twice. Example of a wrong page title: Contact - NAME | NAME. Example of a normal title: Our Services | NAME. Now, if I go to General Settings and change the title, it changes both, so no improvement. SEO by Yoast has the option to reset page titles, but that just removes all titles, leaving the current URL as the page title, so no good either. Here is the code I originally had:

        <title><?php wp_title(''); ?><?php if(wp_title('', false)) { echo ' | '; } ?><?php bloginfo('name'); ?></title>

    Here is the code I am using now:

        <title><?php wp_title('|'); ?></title>

    To sum up, I think somewhere in the database there's a wp_title repeated: once using '-' as the separator, and another one (the current one) using '|'. Any help will be most appreciated, thanks!

    Read the article

  • MDM for Tax Authorities

    - by david.butler(at)oracle.com
    In last week's MDM blog, we discussed MDM in the Public Sector. I want to continue that thread. After all, no industry faces tougher data quality problems than governmental organizations, and few industries suffer more significant downside consequences to poor operations than local, state and federal governments. One key challenge area is taxation. Tax Authorities face a multitude of IT challenges. Firstly, the data used in tax calculations is increasing in volume and complexity. They must improve service by introducing multi-channel contact centers and self-service capabilities. Security concerns necessitate increasingly sophisticated data protection procedures. And cost constraints are driving Tax Authorities to rely on off-the-shelf software for many of their functional areas. Compounding these issues is the fact that the IT architectures in operation at most revenue and collections agencies are very complex. They typically include multiple, disparate operational and analytical systems across which the sum total of data about individual constituents is fragmented. To make matters more complicated, taxation is not carried out by a single jurisdiction, and often sources of income including employers, investments and other sources of taxable income and deductions must also be tracked and shared among tax authorities. Collectively, these systems are involved in tax assessment and collections, risk analysis, scoring, tracking, auditing and investigation case management.

    The Problem of Constituent Data Management

    The infrastructure described above makes it very difficult to create a consolidated representation of a given party. Differing formats and data models mean that a constituent may be represented in one way in one system and in a different way in another. Individual records are frequently inaccurate, incomplete, out of date and/or inconsistent with other records relating to the same constituent. When constituent data must be aggregated and scored, information within each system must be rationalized and normalized so the agency can produce a constituent information file (CIF) that provides a single source of truth about that party. If information about that constituent changes, each system in turn must be updated. There have been many attempts to solve this problem with technology: from consolidating transactional systems to conducting manual systems integration projects and superimposing layers of business intelligence and analytics. All these approaches can be successful in solving a portion of the problem at a specific point in time, but without an enterprise perspective, anything gained is quickly lost again.

    Oracle Constituent Data Mastering for Tax Authorities: A Single View of the Constituent

    Oracle has a flexible and long-term solution to the problem of securely integrating and managing constituent data. The Oracle solution for mastering constituent data for Tax Authorities is based on two core product offerings: Oracle Customer Hub and - optionally - Oracle Application Integration Architecture (AIA). Customer Hub is a master data management (MDM) product that centralizes, de-duplicates, and enriches constituent data. It unifies fragmented information without disrupting existing business processes or IT investments. Role-based data access and privacy rules guarantee maximum security and privacy. Data is continuously and automatically synchronized with all source systems.

    With the Oracle Customer Hub managing the master constituent identity, every department can capture transaction activity against the same record, improving reporting accuracy, employee productivity, reliability of constituent analytics, and day-to-day constituent relationships. Oracle Application Integration Architecture provides a collection of core pre-built processes to support out-of-the-box Master Data Governance across Oracle Customer Hub, Siebel CRM, and Oracle E-Business Suite. It also provides a framework to enable MDM integrations with other Oracle and non-Oracle applications. Oracle AIA removes some of the key inhibitors to implementing a service-oriented architecture (SOA) by providing a pre-built SOA-based middleware foundation as well as industry-optimized service-oriented applications, all built around a SOA governance model that encourages effective design and reuse. I encourage you to read Oracle Solution for Mastering Constituents Data for Public Sector - Tax Authorities by Roberto Negro. It is an outstanding whitepaper that describes how the Oracle MDM solution allows you to create a unified, reconciled source of high-quality constituent data and gain an accurate single view of each constituent. This foundation enables you to lower the costs associated with data quality and integration and create a tax organization that is efficient, secure and constituent-centric. Also, don't forget the upcoming webcast on Thursday, February 10th: "Deliver Improved Services to Citizens at Lower Cost to Your Organization". Our guest speaker is Ruben Spekle, from Capgemini. He will also provide insight into Public Sector Master Data Management and Case Management implementations, including one that was executed for a Dutch Government Agency. If you are interested in how governmental organizations from around the world are using MDM to advance their cause, click here to register for the webcast.

    Read the article

  • CodePlex Daily Summary for Saturday, November 09, 2013

    CodePlex Daily Summary for Saturday, November 09, 2013

    Popular Releases

    - Coolpy: CoolpyI: (Chinese description not recoverable in the source; mentions a ROM and Windows Phone)
    - Praxis2: Use Case Specifications, Iteration 1: Use Case Specifications, Iteration 1. Responsible: Anderson - UC Search Work, UC Register Work, UC Register Rental; Juan Victor - UC Search Client, UC Register Client, UC Register Delivery
    - Media Companion: Media Companion MC3.586b: TV - Multi-episodes restored to MC. There have been plenty of bug fixes occurring lately, with IMDB changing their info, and some great feedback from users. But thanks to Billyad2000, multi-episodes are now displaying correctly in Media Companion, complete with all functionality. This was a hard effort, with more than a few devs in the past having looked at this code to get it working. But, like a light-bulb going off, Billy's managed to massage the code and restore this much-missed function...
    - Dynamics AX 2012 R2 Kitting: AX 2012 R2 CU7 release of Kitting: Here is the AX 2012 R2 CU7 release of kitting. Released both as an XPO and a model.
    - PantheR's GraphX for .NET: GraphX for .NET RELEASE v1.0.1: PLEASE RATE THIS RELEASE IF YOU LIKED IT! THANKS! :) RELEASE 1.0.1 + Changed ExportToImage() parameters: added useZoomControlSurface param that enables the zoom control's parent visual space to be used for export instead of the whole GraphArea panel. Using this technique it is possible to export graphs with negative vertex coordinates. + Added common interface IZoomControl for all included zoom controls + Added new method GraphArea.GenerateGraph() that accepts only optional parameters and will use in...
    - ConEmu - Windows console with tabs: ConEmu 131107 [Alpha]: ConEmu - developer build, x86 and x64 versions. Written in C++, no additional packages required. Run "ConEmu.exe" or "ConEmu64.exe". Some useful information you may find: http://superuser.com/questions/tagged/conemu http://code.google.com/p/conemu-maximus5/wiki/ConEmuFAQ http://code.google.com/p/conemu-maximus5/wiki/TableOfContents If you want to use ConEmu in portable mode, just create an empty "ConEmu.xml" file next to "ConEmu.exe"
    - Team Foundation Server Upgrade Guide: v3 - TFS 2013 Upgrade Guide: Welcome to the Team Foundation Server Upgrade Guide. Quality-Bar Details: documentation has been reviewed by Visual Studio ALM Rangers; documentation has not been through an independent technical review. Known issues: none. The Upgrading SharePoint section is not included yet; independent technical review is pending.
    - VidCoder: 1.5.12 Beta: Added an option to preserve Created and Last Modified times when converting files (in Options -> Advanced). Added an option to mark an automatically selected subtitle track as "Default". Updated HandBrake core to SVN 5878. Fixed auto passthrough not applying just after switching to it. Fixed bug where preset/profile/tune could disappear when reverting a preset.
    - Toolbox for Dynamics CRM 2011/2013: XrmToolBox (v1.2013.9.25): XrmToolbox improvement: correct changing connection from the status dropdown. Tools improvement, updated tools: Audit Center (v1.2013.9.10) -> publish entities; Iconator (v1.2013.9.27) -> optimized asynchronous loading of images and entities; MetadataDocumentGenerator (v1.2013.11.6) -> correct system entities reading with incorrect attribute type; Script Manager (v1.2013.9.27) -> retrieve only custom events; SiteMapEditor (v1.2013.11.7) -> reset of CRM 2013 SiteMap; ViewLayoutReplicator (v1.201...
    - Microsoft SQL Server Product Samples: Database: SQL Server 2014 CTP2 In-Memory OLTP Sample, based: This sample showcases the new In-Memory OLTP feature, which is part of SQL Server 2014 CTP2. It shows the new memory-optimized tables and natively-compiled stored procedures, and can be used to show the performance benefit of In-Memory OLTP. Installation instructions for the sample are included in the file 'awinmemsample.doc', which is part of the download. You can ask a question about this sample at the SQL Server Samples Forum.
    - Composite C1 CMS - Open Source on .NET: Composite C1 4.1: Composite C1 4.1 (4.1.5058.34326). Write a review for this release - help us improve, recommend us. Getting started: if you are new to Composite C1 and want to install it: http://docs.composite.net/Getting-started What's new in Composite C1 4.1: the following are highlights of major changes since Composite C1 4.0. General user features: drag-and-drop images and files like PDF and Word directly from your own desktop and folders into page content; allows you to install the Composite Form Builder ...
    - CS-Script for Notepad++ (C# intellisense and code execution): Release v1.0.9.0: Implemented Recent Scripts list. Added checking for plugin updates from AboutBox. Multiple formatting improvements/fixes. Implemented selection of the CLR version when preparing the distribution package. Added project panel button for showing the plugin shortcuts list. Added 'What's New?' panel. Fixed auto-formatting scrolling artifact. Implemented navigation to the "logical" file (vs. the auto-generated file) from the output panel. To avoid the DLLs getting locked by the OS, use the MSI file for the installation.
    - Home Access Plus+: v9.7: Updated: JSON.net. Fixed: issue with the Windows 8 App. Added: Windows 8.1 App. Added: Win: Self Signed HAP+ Install Support. Added: Win: Delete File Support. Added: timeout for the Logon Tracker. Removed: error dialogs on the User Card. Fixed: green line showing over the booking form. Note: a web.config file update is required.
    - WPF Extended DataGrid: WPF Extended DataGrid 2.0.0.10 binaries: Now row summaries are updated whenever autofilter values are modified.
    - xUnit.net - Unit testing framework for C# and .NET (a successor to NUnit): xUnit.net Visual Studio Runner: A placeholder for downloading Visual Studio runner VSIX files, in case the Gallery is down (or you want to downgrade to older versions).
    - VeraCrypt: VeraCrypt version 1.0c: Changes between 1.0b and 1.0c (11 November 2013): correctly set the minimum required version in the volume header (this value must always follow the program version after any major changes). This also solves the hidden volume issue.
    - Captcha MVC: Captcha MVC 2.5: v 2.5: Added support for MVC 5. The DefaultCaptchaManager no longer throws an error if the captcha values were entered incorrectly. Minor changes. v 2.4.1: Fixed issues with deleting incorrect values of the captcha token in the SessionStorageProvider. This could lead to a situation where the captcha was not working with the SessionStorageProvider. Minor changes. v 2.4: Changed the IIntelligencePolicy interface, added ICaptchaManager as a parameter for all methods. Improved font size ...
    - Duplica: duplica 0.2.498: this is the first stable release
    - DNN Blog: 06.00.01: 06.00.01 Release. This is the first bugfix release of the new v6 blog module. These are the changes: added some robustness in v5-v6 scripts to cater for some rare upgrade scenarios; changed the name of the module definition to avoid a clash with Evoq Social; addition of sitemap provider.
    - VG-Ripper & PG-Ripper: VG-Ripper 2.9.50: Changes - NEW: Added support for "ImageHostHQ.com" links; NEW: Added support for "ImgMoney.net" links; NEW: Added support for "ImgSavy.com" links; NEW: Added support for "PixTreat.com" links; bug fixes.

    New Projects

    - AppBootloader: (Chinese description not recoverable in the source) Make your C\S program more flexible for automatic updates.
    - Arduino Visual Studio: The purpose of this project is to demonstrate using Visual Studio 2012 with an Atmel chip on an Arduino UNO board.
    - ASP.NET Identity: ASP.NET Identity is the new membership system for building ASP.NET web applications. ASP.NET Identity allows you to add login features to your application and m
    - Asset Maintenance Management Console: A.M.M.C is an attempt at creating an extremely versatile interface tool to maintain assets.
    - AX 2012 R2 SYNC: SYNC for AX 2012 introduces a centralized company concept which holds and manages the enterprise-wide master data for synchronizing across multiple companies.
    - BYOND - Build Your OwN Device (audio synths, effects, DSP, sequencer, VST): Byond is an environment for audio and MIDI programming in C#. It's available as a VST plugin or a standalone application.
    - CoveSmushbox: A simple .NET library and a Windows CLI to the SMUSH Box.
    - FORMULA 2.0: Formula specifications are highly declarative logic programs that can express rich synthesis and verification problems.
    - Grostbite Engine: Free 3D game engine.
    - hMailServer from RoundCube: A RoundCube plugin for interacting with hMailServer 5.4B1950. The plugin allows configuring the vacation configuration of hMailServer from RoundCube.
    - KDG's IP Reporter: Basic reporter for local and private IP addresses.
    - LADNS Service Watcher: LADNS Service Watcher
    - Lampguiden: Lampguiden
    - Media Recommender Service: We are 6 software engineering students developing a media recommendation service as part of our 3rd semester project.
    - photograp: SPA photo gallery. Work in progress...
    - Power Buddy: The Windows power tray icon only displays two power plans. If this has bothered you since 2009, Power Buddy is for you. Power Buddy displays all of them.
    - Programming Demos: This project contains demonstration code that may be helpful to people learning VisualBasic .NET.
    - Prototype : Traveling Alone Website: A website aiming to become an online community for solo travelers.
    - ShowDBPool: ShowDBPool
    - Thali: Thali is about making it falling-off-a-log easy for users to run their own services on their own devices by building a peer-to-peer web.
    - TiendaWebCursoAccentureNet: a
    - Wpf PdfReader: This is a PDF reader, developed based on WPF and MuPDF. You can use the keyboard to operate it. This PDF reader can save the user's open records.

    Read the article

  • How to have an improved relationship with recruiters?

    - by crosenblum
    I personally always have problems with recruiters and their constant spam. I usually get tons of emails for jobs not related to what I do. Or they have no idea what I do. Or they say they have a job in my field, but make me go through hours of paperwork, only to find out they had no real job lead. Or my resume contained a keyword that they searched for, but that keyword is like 1-10% of what I do, not my main skill set. My point is that I want something more polite, more accurate, and less of a waste of each other's time. So I want to come up with a form letter I can create in Gmail to automatically send to all recruiters, to help inform, educate, and train them to deal better with me. That way, they know exactly what to send to me, so as not to waste my time, and we don't play email/phone tag just to find out they have no idea what I do or how to find a job lead that matches it. I want this to be an improvement in my relationship and experience with recruiters, because honestly most of them waste my time. They call me at work, not considering that I can't take phone calls at work, and they already had my email address. Mostly they annoy me, but I am tired of having to be rude to get my point across. I want them to immediately know what I can do and have done (have you read my resume?) and to have actual leads ready to be hired or interviewed for soon or now. Any suggestions on how to improve the communication, to avoid wasting each other's time? I certainly hate having to come across as rude or improper, but when they waste so much of my time, I don't know what else to do. So thank you for your time. Just to be clear, I want your help to write a form letter that I can send to every recruiter that emails me, on how best to work with me and other people in IT/web careers.

    Read the article

  • A better way to encourage contributions to OSS

    - by Daniel Cazzulino
    Currently in the .NET world, most OSS projects are available via a NuGet package. Users have a very easy path towards *using* the project right away. But let's say they encounter some issue (maybe a bug, maybe a potential improvement) with the library. At this point, going from user to contributor (of a fix, a good bug repro, or even a spike for a new feature) is a very steep and non-trivial multi-step process: registering with some open source hosting site (CodePlex, GitHub, Bitbucket, etc.), learning how to grab the latest sources, building the project, formulating a patch (or forking the code), learning the source control software they use (Mercurial, Git, SVN, TFS), installing whatever tools are needed for it, reading about the contributor workflow for the project (do you fork & send pull requests? do you just send a patch file? do you just send a snippet? a unit test? etc.), and on, and on, and on. Granted, you may be lucky and already know the source control system the project uses, but in reality, I'd say the chances are pretty low. I believe most developers *using* OSS are far from familiar with them, much less with contributing back to various projects. We OSS devs like to be on the cutting edge all the time, ya' know, always jumping on the new SCC system, the new hosting site, the new agile way of managing work items, bug tracking, code reviews, etc. etc. etc. But most of our OSS users are largely the "... Read full article

    Read the article

  • WebCenter Customer Spotlight: spectrumK Holding GmbH

    - by me
    Solution Summary: spectrumK Holding GmbH was founded in 2007 by various German health insurance funds and national insurance associations and is a service provider for the healthcare market, covering patient care management, financial management, and information management, as well as payment services and legal counseling. spectrumK Holding GmbH's business objective was to implement innovative new Web-based services and solution systems for health insurance funds by integrating a multitude of isolated solutions from different organizations. Using Oracle WebCenter Portal, Oracle WebCenter Content, and Site Studio, the customer created a multiple-portal environment and deployed the first three applications, covering patient receipts, a medication navigator, and disability information. spectrumK Holding GmbH accelerated time-to-market for new features by reducing development time, achieved 40% development and cost savings using standard modules, and realized 80% overall savings using the Oracle multiple-portal environment, as compared to individual installations. >> Read the full story

    Read the article

  • Design review for application facing memory issues

    - by Mr Moose
    I apologise in advance for the length of this post, but I want to paint an accurate picture of the problems my app is facing and then pose some questions below; I am trying to address some self-inflicted design pain that is now leading to my application crashing due to out-of-memory errors. An abridged description of the problem domain is as follows: The application takes in a "dataset" that consists of numerous text files containing related data. An individual text file within the dataset usually contains approx 20 "headers" that contain metadata about the data it contains. It also contains a large tab-delimited section containing data that is related to data in one of the other text files within the dataset. The number of columns per file is very variable, from 2 to 256+ columns. The original application was written to allow users to load a dataset, map certain columns of each of the files (basically indicating key information on the files to show how they are related), and identify a few expected column names. Once this is done, a validation process takes place to enforce various rules and ensure that all the relationships between the files are valid. Once that is done, the data is imported into a SQL Server database. The database design is an EAV (Entity-Attribute-Value) model used to cater for the variable columns per file. I know EAV has its detractors, but in this case, I feel it was a reasonable choice given the disparate data and variable number of columns submitted in each dataset.
    The memory problem: Given that the combined size of all text files was at most about 5 megs, and in an effort to reduce the database transaction time, it was decided to read ALL the data from the files into memory and then perform the following:
    - perform all the validation whilst the data was in memory
    - relate it using an object model
    - start a DB transaction and write the key columns row by row, noting the Id of the written row (all tables in the database utilise identity columns), then apply the Id of the newly written row to all related data
    - once all related data had been updated with the key information to which it relates, write these records using SqlBulkCopy (due to our EAV model, we essentially have x columns by y rows to write, where x can be 256+ and rows are often into the tens of thousands)
    - once all the data is written without error (which can take several minutes for large datasets), commit the transaction.
    The problem now comes from the fact that we are now receiving individual files containing over 30 megs of data. In a dataset, we can receive any number of files. We've started seeing datasets of around 100 megs coming in, and I expect it is only going to get bigger from here on in. With files of this size, the data can't even be read into memory without the app falling over, let alone be validated and imported. I anticipate having to modify large chunks of the code to allow validation to occur by parsing files line by line, and I am not exactly decided on how to handle the import and transactions.
    Potential improvements: I've wondered about using GUIDs to relate the data rather than relying on identity fields. This would allow data to be related prior to writing to the database. It would certainly increase the storage required, though, especially in an EAV design. Would you think this is a reasonable thing to try, or should I simply persist with identity fields (natural keys can't be trusted to be unique across all submitters)?
    Use of staging tables to get data into the database, and only performing the transaction to copy data from the staging area to the actual destination tables.
    Questions: For systems like this that import large quantities of data, how do you go about keeping transactions small? I've kept them as small as possible in the current design, but they are still active for several minutes and write hundreds of thousands of records in one transaction. Is there a better solution? The tab-delimited data section is read into a DataTable to be viewed in a grid. I don't need the full functionality of a DataTable, so I suspect it is overkill. Is there any way to turn off various features of DataTables to make them more lightweight? Are there any other obvious things you would do in this situation to minimise the memory footprint of the application described above? Thanks for your kind attention.
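    On the transaction-size question, the usual shape of the fix is the one the post is already circling: stream the tab-delimited section line by line and hand bounded batches to the writer (for example, a bulk copy into a staging table), so memory use depends on the batch size rather than the file size. The original application is .NET; the sketch below is only a rough illustration of that streaming shape in Java, and the batch size and batch writer are assumed placeholders.

        import java.io.BufferedReader;
        import java.io.IOException;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.util.ArrayList;
        import java.util.List;
        import java.util.function.Consumer;

        // Streaming import sketch: read one line at a time, validate it, and hand
        // bounded batches to a writer, so memory no longer grows with file size.
        final class StreamingImporter {
            private static final int BATCH_SIZE = 10_000; // assumed; tune per transaction budget

            static void importFile(Path file, Consumer<List<String[]>> writeBatch) throws IOException {
                List<String[]> batch = new ArrayList<>(BATCH_SIZE);
                try (BufferedReader reader = Files.newBufferedReader(file)) {
                    String line;
                    while ((line = reader.readLine()) != null) {
                        String[] columns = line.split("\t", -1);
                        // per-row validation would go here; a bad row can be reported
                        // without aborting the whole stream
                        batch.add(columns);
                        if (batch.size() == BATCH_SIZE) {
                            writeBatch.accept(batch); // e.g. bulk copy into a staging table
                            batch = new ArrayList<>(BATCH_SIZE);
                        }
                    }
                }
                if (!batch.isEmpty()) {
                    writeBatch.accept(batch);
                }
            }
        }

    Combined with staging tables, the long single transaction then shrinks to one short transaction per batch plus a final staging-to-destination copy.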

    Read the article

  • Versioning Strategy for Service Interfaces JAR

    - by Colin Morelli
    I'm building a service oriented architecture composed (mostly) of Java-based services, each of which is a Maven project (in an individual repository) with two submodules: common, and server. The common module contains the service's interfaces that clients can include in their project to make service calls. The server submodule contains the code that actually powers the service. I'm now trying to figure out an appropriate versioning strategy for the interfaces, such that each interface change results in a new common jar, but changes to the server (so long as they don't impact the contract of the interfaces) receive the same common jar. I know this is pretty simple to do manually (simply increment the server version and don't touch the common one), but this project will be built and deployed by a CI server, and I'd like to come up with a strategy for automatically versioning these. The only thing I have been able to come up with so far is to have the CI server md5 the service interfaces.
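    The closing idea, having the CI server md5 the service interfaces, can be prototyped as a small build step: hash the common module's files in a stable order and only bump the common artifact's version when the digest changes. Below is a sketch of that fingerprint step; what exactly to hash (sources versus compiled class files) and the directory layout are assumptions here, not part of the question.

        import java.io.IOException;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.security.MessageDigest;
        import java.security.NoSuchAlgorithmException;
        import java.util.List;
        import java.util.stream.Collectors;
        import java.util.stream.Stream;

        // Computes one MD5 over all Java sources under the common module, in a
        // stable order, so a CI job can detect whether the interface contract changed.
        final class InterfaceFingerprint {
            static String of(Path commonSrcRoot) throws IOException, NoSuchAlgorithmException {
                MessageDigest md5 = MessageDigest.getInstance("MD5");
                try (Stream<Path> files = Files.walk(commonSrcRoot)) {
                    List<Path> sources = files
                            .filter(p -> p.toString().endsWith(".java"))
                            .sorted() // stable ordering makes the digest reproducible
                            .collect(Collectors.toList());
                    for (Path source : sources) {
                        md5.update(Files.readAllBytes(source));
                    }
                }
                StringBuilder hex = new StringBuilder();
                for (byte b : md5.digest()) {
                    hex.append(String.format("%02x", b));
                }
                return hex.toString();
            }
        }

    The CI job would compare this digest against the one recorded for the last published common jar; if they match, only the server version is incremented, otherwise both modules get a new version.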

    Read the article
