Search Results

Search found 2063 results on 83 pages for 'jason weber'.

Page 5/83 | < Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >

  • ACORD LOMA Session Highlights Policy Administration Trends

    - by [email protected]
    Helen Pitts, senior product marketing manager for Oracle Insurance, attended and is blogging from the ACORD LOMA Insurance Forum this week.

    Above: Paul Vancheri, Chief Information Officer, Fidelity Investments Life Insurance Company. Vancheri gave a presentation during the ACORD LOMA Insurance Systems Forum about the key elements of modern policy administration systems and how insurers can mitigate risk during legacy system migrations to safely introduce new technologies.

    When I had a few particularly challenging honors courses in college, my father, a long-time technology industry veteran, used to say, "If you don't know how to do something, go ask the experts. Find someone who has been there and done that, don't be afraid to ask the tough questions, and apply and build upon what you learn." (Actually, he still offers this same advice today.) That's probably why my favorite sessions at industry events, like the ACORD LOMA Insurance Forum this week, are those that include insight on industry trends and case studies from carriers who share their experiences and offer best practices based upon their own lessons learned. I had the opportunity to attend a particularly insightful session Wednesday as Craig Weber, senior vice president of Celent's Insurance practice, and Paul Vancheri, CIO of Fidelity Investments Life Insurance Company, presented "Managing the Dynamic Insurance Landscape: Enabling Growth and Profitability with a Modern Policy Administration System."

    Policy Administration Trends
    Growing the business is the top IT issue among both life and annuity and property and casualty carriers, according to Weber. To drive growth and capture market share from competitors, carriers are looking to modernize their core insurance systems, with 65 percent of the CIOs participating in recent Celent research citing plans to replace their policy administration systems. Weber noted that there has been continued focus and investment, particularly in the last three years, by software and technology vendors to offer modern, rules-based, configurable policy administration solutions. He added that these solutions are continuing to evolve with the ongoing aim of helping carriers rapidly meet shifting business needs--whether to launch new products to market faster than the competition, adapt existing products to meet shifting consumer and/or regulatory demands, or exit unprofitable markets. He closed by noting the top four trends for policy administration, either in the process of being adopted today or on the not-so-distant horizon:
    - Underwriting and service desktops
    - New business automation
    - Convergence of ultra-configurable and domain content-rich systems
    - Better usability and screen design

    Mitigating the Risk When Making the Decision to Modernize
    Third-party analyst research from advisory firms like Celent was a key part of the due diligence process for Fidelity as it sought a replacement for its legacy policy administration system back in 2005, according to Vancheri. The company's business opportunities were outrunning system capability. Its legacy system had not been upgraded in several years and was deficient from a functionality and currency standpoint, which was constraining the carrier's ability to rapidly configure and bring new and complex products to market. The company sought a new, modern policy administration system, one that would enable it to keep pace with rapid and often unexpected industry changes and stay ahead of the competition.

    A cross-functional team that included representatives from finance, actuarial, operations, client services and IT conducted an extensive selection process. This process included deep documentation review, pilot evaluations, demonstrations of required functionality and complex problem-solving, infrastructure integration capability, and the ability to meet the company's desired cost model. The company ultimately selected an adaptive policy administration system that met its requirements to:
    - Deliver ease of use - eliminating paper and rework, while easing the burden on representatives to sell and service annuities
    - Provide customer parity - offering Web-based capabilities in alignment with the company's focus on delivering a consistent customer experience across its business
    - Deliver scalability and efficiency - enabling automation, while simplifying and standardizing systems across its technology stack
    - Offer desired functionality - supporting Fidelity's product configuration / rules management philosophy, focus on customer service, and technology upgrade requirements
    - Meet cost requirements - including implementation, professional services and license fees, and ongoing maintenance
    - Deliver upon business requirements - enabling faster time to market for new products and the flexibility to make changes

    Best Practices for Addressing Implementation Challenges
    Based upon lessons learned during the company's implementation, Vancheri advised carriers to evaluate staffing capabilities and cultural impacts, review business requirements to avoid rebuilding legacy processes, factor in dependent systems, and review policies and practices to secure customer data. His formula for success: upfront planning + clear requirements = precision execution.

    Achieving a Return on Investment
    Vancheri said the decision to replace the legacy policy administration system and deploy a modern, rules-based system--before the economic downturn occurred--has been integral in helping the company adapt to shifting market conditions, while enabling growth in its direct channel sales of variable annuities. Since deploying its new policy admin system, the company has reduced its average time to market for new products from 12-15 months to 4.5 months. The company has since migrated its other products to the new system and retired its legacy system, significantly decreasing its overall product development cycle. From a processing standpoint, Vancheri noted the company has achieved gains in automation, information, and ease of use, resulting in improved real-time data edits, controls for better quality, and tax handling capability. Plus, by having only one platform to manage, the company has simplified its IT environment and is well positioned to deliver system enhancements for greater efficiencies.

    Commitment to Continuing the Investment
    In the short and longer term, Vancheri said the company plans to enhance business functionality to support money movement, wire automation, divorce processing on payout contracts, and cost-basis tracking improvements. It also plans to continue system upgrades to remain current, as well as focus on further reducing cycle time, driving down maintenance costs, and integrating with other products.

    Helen Pitts is senior product marketing manager for Oracle Insurance focused on life/annuities and enterprise document automation.

    Read the article

  • Sql order by within a group by with aggregate

    - by NG
    Say I have a table of Team / Name / Some number:

        Cardinals  Jason   8
        Cardinals  Chris   5
        Yankees    Joba    6
        Cubs       Carlos  6
        Cardinals  Chris   6

    And I want:

        Cardinals  Jason   8
        Cardinals  Chris  11
        Cubs       Carlos  6
        Yankees    Joba    6

    So what I'm doing is grouping by team, grouping by name, and summing the number. However, within Cardinals I want to make sure the names are in a particular order. If I just do an "order by name desc", for example, then the whole grouping gets ignored. So how can I order within a group?
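
    A minimal sketch of one way to express this (table and column names are assumed, since the post doesn't give them): the trick is to keep team as the leading sort key, so the per-name ordering applies within each team instead of across the whole result.

        SELECT team, name, SUM(num) AS total
        FROM scores
        GROUP BY team, name
        ORDER BY team ASC, name DESC;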

    Read the article

  • Including email, IMs, configs, etc. in documentation or notes

    - by Jason Antman
    The shop I work in is pretty laid-back. We're on a documentation kick, only because historically we've been very bad with it. We do a lot of our brainstorming in face-to-face meetings, and also do a lot of communication via IM in addition to email. While I'm usually pretty good about documentation and keeping copious lab notes, I just finished a build of a host and spent hours searching through IMs, emails, files on my workstation, etc. to pull out anything I missed in my lab notes, which formed a large part of the basis for the internal documentation. Does anyone have any thoughts on managing various data sources (especially email and IM) and tracking them on a per-project basis, aside from manually saving things to a project directory? Ideally, I'd like an easy way to put copies of emails, IM logs, etc. into a project-specific directory on my workstation and then just have a cron job that syncs it up with a shared folder. This isn't really a candidate for anything more advanced, as the bulk of the data will be copies of configs, code, etc. Here are the big restrictions: email is via a centralized Zimbra install, so nothing can happen server-side, and my workstation is Linux. Aside from writing Pidgin and Thunderbird plugins that let me tag chats and emails as belonging to a project, and then copy them to the appropriate place... any thoughts? Suggestions? Thanks, Jason
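
    For the sync half of that, a minimal sketch of the cron + rsync approach mentioned above (all paths are hypothetical):

        # sync per-project notes to the shared folder every 15 minutes
        */15 * * * *  rsync -a --delete ~/projects/ /mnt/shared/project-notes/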

    Read the article

  • ASP.Net Response Filter Clashing with SharePoint 2010 Publishing Site Defaults

    - by Jason Weber
    Hello everyone, I'm debugging an HttpModule with an ASP.NET response filter that dynamically rewrites portions of rendered SharePoint WCM pages. The publishing pages render fine in SP2007 on both Server 2003 and Server 2008. However, the equivalent pages fail to render in SP2010 B2 on Server 2008 R2 / IIS7. ASP.NET returns the following error:

        Post cache substitution is not compatible with modules in the IIS integrated pipeline that modify the response buffers. Either a native module in the pipeline has modified an HTTP_DATA_CHUNK structure associated with a managed post cache substitution callback, or a managed filter has modified the response.

    This error is consistent with KB #2014472. However:
    - Caching is disabled for anonymous & authenticated access at the site collection level
    - There do not appear to be any Substitution controls on either the master or layout page
    - The IIS 7 settings are all stock default

    This is happening e.g. on /pages/default.aspx. It seems likely I'm missing something cache-related... but what?
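
    For context, a minimal sketch of the kind of filter hookup being described (the post doesn't include the actual code, so the module and stream class names here are assumptions):

        public class RewriteModule : IHttpModule
        {
            public void Init(HttpApplication app)
            {
                // Wrap the response stream so the rendered page body
                // flows through the rewriting filter before it is sent.
                app.BeginRequest += delegate
                {
                    app.Response.Filter = new RewriteFilterStream(app.Response.Filter);
                };
            }

            public void Dispose() { }
        }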

    Read the article

  • Tinyurl API Example - Am I doing it right :D

    - by Paul Weber
    Hi ... we use super-long hashes for the registration of new users in our application. The problem is that these hashes break in some email clients, making the links unusable. I tried implementing the TinyURL API with a simple call, but I think it times out sometimes ... sometimes the mail does not reach the user. I updated the code, but now the URL is never converted. Is TinyURL really so slow, or am I doing something wrong? (I mean, hey, 5 seconds is a lot these days.) Can anybody recommend a more reliable service?

    All my fault: I forgot a false in the fopen. But I will leave this sample of code here, because I often see this one-liner, which I think does not work very reliably:

        return file_get_contents('http://tinyurl.com/api-create.php?url='.$u);

    This is the (I think) fully working sample. I would like to hear about improvements.

        static function gettinyurl( $url ) {
            $context = stream_context_create( array(
                'http' => array(
                    'timeout' => 5  // 5 seconds should be enough
                )
            ) );
            // open (read) api-create.php with the long url as GET parameter;
            // the third argument (use_include_path = false) is required so the
            // context lands in the fourth slot, as fopen() expects
            $fp = fopen( 'http://tinyurl.com/api-create.php?url='.$url, 'r', false, $context );
            if( $fp ) {                             // check if open was ok
                $tinyurl = fgets( $fp );            // read response
                if( $tinyurl && !empty($tinyurl) )  // check if response is ok
                    $url = $tinyurl;                // set response as url
                fclose( $fp );                      // close connection
            }
            return $url;                            // return (tiny) url
        }
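
    For comparison, a sketch of the same call using file_get_contents with a timeout context, which sidesteps the fopen argument-order pitfall entirely (the urlencode is an added precaution, not part of the original):

        $context = stream_context_create( array( 'http' => array( 'timeout' => 5 ) ) );
        $tiny = file_get_contents( 'http://tinyurl.com/api-create.php?url='.urlencode($url), false, $context );
        if ( $tiny !== false && $tiny !== '' ) {
            $url = $tiny;   // fall back to the long url on failure
        }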

    Read the article

  • ASP.Net Response Filter Causing SharePoint 2010 "Unexpected Error"

    - by Jason Weber
    Hello everyone, I'm debugging an HttpModule with an ASP.NET response filter that dynamically rewrites portions of rendered SharePoint WCM pages. The publishing pages render fine in SP2007 on both Server 2003 and Server 2008. However, the equivalent pages fail to render in SP2010 B2 on Server 2008 R2: the generic "An unexpected error has occurred" page is displayed. This error only happens when the response filter is applied to an .aspx page; other page types, such as .css, render fine on this platform. It also happens when the response filter does not modify the page at all (pure pass-through). This KB article seems very closely related: http://support.microsoft.com/kb/2014472. However, this same error occurs with caching disabled. I see no related entries in any of the following: ULS for SP, Event Log, Failed Request Tracing (IIS7). Running under the debugger suggests that the custom code is not raising any exceptions. Any help or insight would be greatly appreciated.
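
    To illustrate the "pure pass-through" case mentioned above, a minimal sketch of such a filter (the post doesn't include the actual code, so this is an assumed shape, not the author's implementation):

        using System;
        using System.IO;

        // Forwards every buffer unchanged; per the post, even this
        // "do nothing" filter triggers the SP2010 error page.
        public class PassThroughFilter : Stream
        {
            private readonly Stream _inner;
            public PassThroughFilter(Stream inner) { _inner = inner; }

            public override bool CanRead  { get { return false; } }
            public override bool CanSeek  { get { return false; } }
            public override bool CanWrite { get { return true; } }
            public override long Length   { get { throw new NotSupportedException(); } }
            public override long Position
            {
                get { throw new NotSupportedException(); }
                set { throw new NotSupportedException(); }
            }

            public override void Flush() { _inner.Flush(); }

            public override void Write(byte[] buffer, int offset, int count)
            {
                _inner.Write(buffer, offset, count);  // pass bytes straight through
            }

            public override int Read(byte[] buffer, int offset, int count)
            { throw new NotSupportedException(); }
            public override long Seek(long offset, SeekOrigin origin)
            { throw new NotSupportedException(); }
            public override void SetLength(long value)
            { throw new NotSupportedException(); }
        }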

    Read the article

  • Why won't Silverlight handle the conversion of my custom float property

    - by Jeff Weber
    In a Silverlight 4 project I have a class that extends Canvas:

        public class AppendageCanvas : Canvas
        {
            public float Friction { get; set; }
            public float Restitution { get; set; }
            public float Density { get; set; }
        }

    I use this canvas in Blend by dragging it onto another control and setting the custom properties in the property grid. When I run the app, I get an error when InitializeComponent is called on the control containing my custom canvas. I'm not sure why Silverlight isn't able to convert this property from its string representation in XAML to the float that it is. Anyone have any ideas?
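
    One workaround worth trying (an assumption, not a confirmed fix from this thread): declare the properties as double, which the Silverlight XAML parser converts from strings without any custom TypeConverter.

        public class AppendageCanvas : Canvas
        {
            // double avoids the string-to-float conversion that fails above
            public double Friction { get; set; }
            public double Restitution { get; set; }
            public double Density { get; set; }
        }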

    Read the article

  • What is the best way get and hold property reference by name in c#

    - by Jeff Weber
    I want to know if there is a better way (than what I'm currently doing) to obtain and hold a reference to a property in another object, using only the object and property string names. Particularly, is there a better way to do this with the new dynamic functionality of .NET 4.0?

    Here is what I have right now. I have a "PropertyReference<T>" object that takes an object name and property name in the constructor. An Initialize() method uses reflection to find the object and property, then stores the property setter as an Action<T> and the property getter as a Func<T>. When I want to actually call the property I do something like this:

        int x = _propertyReference.Get();

    or

        _propertyReference.Set(2);

    Here is my PropertyReference<T> code. Please dissect and make suggestions for improvement.

        using System;
        using System.Reflection;

        namespace WindowsFormsApplication2
        {
            public class PropertyReference<T> : IPropertyReference
            {
                public string ComponentName { get; set; }
                public string PropertyName { get; set; }

                public bool IsInitialized
                {
                    get { return (_action != null && _func != null); }
                }

                Action<T> _action;  // wraps the property setter
                Func<T> _func;      // wraps the property getter

                public PropertyReference() { }

                public PropertyReference(string componentName, string propertyName)
                {
                    ComponentName = componentName;
                    PropertyName = propertyName;
                }

                public void Initialize(IEntity e)
                {
                    Object component = e.GetByName(ComponentName);
                    if (component == null) return;

                    PropertyInfo pi = component.GetType().GetProperty(PropertyName);
                    _action = (T a) => pi.SetValue(component, a, null);
                    _func = () => (T)pi.GetValue(component, null);
                }

                public void Reset()
                {
                    _action = null;
                    _func = null;
                }

                public void Set(T value) { _action.Invoke(value); }

                public T Get() { return _func(); }
            }
        }

    Note: I can't use the "Emit" functionality, as I need this code to work on the new Windows Phone 7 and that does not support Emit.
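
    One direction for the ".NET 4.0" part of the question, sketched under the assumption that compiled expression trees are available on the target platform (they avoid Reflection.Emit, unlike DynamicMethod):

        using System;
        using System.Linq.Expressions;
        using System.Reflection;

        // Build a strongly typed getter once; each later call is a plain
        // delegate invocation instead of a reflective PropertyInfo.GetValue.
        static Func<T> MakeGetter<T>(object component, PropertyInfo pi)
        {
            MemberExpression prop = Expression.Property(Expression.Constant(component), pi);
            return Expression.Lambda<Func<T>>(Expression.Convert(prop, typeof(T))).Compile();
        }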

    Read the article

  • JFrame the same shape as an Image / Program running in background

    - by Jan Weber
    My question is simple; the solution surely is not. I am looking for a way to shape a JFrame the same as the Image it will be displaying. By shape, I mean the shape formed by the pixels that have an alpha != 0. I've already found a working example using a GeneralPath object, but it created ~110,000 "nodes" for an Image of about 500*400, so starting the JFrame took more than 2 minutes, which is definitely not the desired effect; startup should take under 2 seconds. Thanks for your time.
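
    A sketch of an alternative that skips outline tracing entirely, assuming a Java 7+ runtime with per-pixel translucency support (the image path and sizes are made up):

        import java.awt.*;
        import javax.swing.*;

        public class ShapedFrame {
            public static void main(String[] args) {
                SwingUtilities.invokeLater(new Runnable() {
                    public void run() {
                        final Image img = Toolkit.getDefaultToolkit().getImage("sprite.png");
                        JFrame f = new JFrame();
                        f.setUndecorated(true);
                        f.setBackground(new Color(0, 0, 0, 0)); // fully transparent frame
                        f.add(new JComponent() {
                            protected void paintComponent(Graphics g) {
                                // only the image's opaque pixels become visible,
                                // so no GeneralPath outline is needed
                                g.drawImage(img, 0, 0, this);
                            }
                        });
                        f.setSize(500, 400);
                        f.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                        f.setVisible(true);
                    }
                });
            }
        }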

    Read the article

  • How do I read a public twitter feed using .Net

    - by Jeff Weber
    I'm trying to read the public twitter status of a user so I can display it in my Windows Phone application. I'm using Scott Gu's example: http://weblogs.asp.net/scottgu/archive/2010/03/18/building-a-windows-phone-7-twitter-application-using-silverlight.aspx

    When my code comes back from the async call, I get a "System.Security.SecurityException" as soon as I try to use the e.Result. I know my URI is correct because I can plop it in the browser and get good results. Here is my relevant code:

        public void LoadNewsLine()
        {
            WebClient twitter = new WebClient();
            twitter.DownloadStringCompleted += new DownloadStringCompletedEventHandler(twitter_DownloadStringCompleted);
            twitter.DownloadStringAsync(new Uri("http://api.twitter.com/1/statuses/user_timeline.xml?screen_name=krashlander"));
        }

        void twitter_DownloadStringCompleted(object sender, DownloadStringCompletedEventArgs e)
        {
            XElement xmlTweets = XElement.Parse(e.Result); // exception thrown here!

            var message = from tweet in xmlTweets.Descendants("status")
                          select tweet.Element("text").Value;

            //Set message and tell UI to update.
            //NewsLine = message.ToString();
            //RaisePropertyChanged("NewsLine");
        }

    Any ideas anyone?
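
    A diagnostic sketch (not a confirmed fix): DownloadStringCompletedEventArgs.Result rethrows whatever went wrong during the download, so checking e.Error first usually reveals the underlying failure.

        void twitter_DownloadStringCompleted(object sender, DownloadStringCompletedEventArgs e)
        {
            if (e.Error != null)
            {
                // the SecurityException surfaces here with its real cause
                System.Diagnostics.Debug.WriteLine(e.Error);
                return;
            }
            XElement xmlTweets = XElement.Parse(e.Result);
            // ... proceed as before ...
        }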

    Read the article

  • iPhone simulator app crashes when appending a string

    - by Franklyn Weber
    Hi, I'm a complete novice, so I'm probably missing something really easy, but I can't get my string appending to work. I add the 3rd character to typedDigit & it crashes - the method is called fine and typedDigit will get to 2 characters long. I think everything is declared properly in the header file. Code is:

        -(IBAction)digitPressed:(UIButton *)sender {
            NSString *digit = [[sender titleLabel] text]; // in this case, "0" - "9"

            if (userIsInMiddleOfTyping) {
                // typedDigit is already at least 1 character long
                typedDigit = [typedDigit stringByAppendingString:digit];
            } else {
                // first character of typedDigit
                typedDigit = digit;
                userIsInMiddleOfTyping = YES;
            }
        }

    Many thanks for any help!
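
    The classic cause of this pattern (a guess, but it fits the 3rd-character crash): under manual reference counting, both -text and stringByAppendingString: return autoreleased strings, so typedDigit is deallocated out from under you on a later run-loop pass. A sketch of the usual fix:

        -(IBAction)digitPressed:(UIButton *)sender {
            NSString *digit = [[sender titleLabel] text];
            NSString *newValue = userIsInMiddleOfTyping
                ? [typedDigit stringByAppendingString:digit]
                : digit;

            [newValue retain];     // take ownership of the new string
            [typedDigit release];  // give up ownership of the old one
            typedDigit = newValue;
            userIsInMiddleOfTyping = YES;
        }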

    Read the article

  • Is putting the javascript before the closing body tag okay on an asp.net website?

    - by Jason Weber
    I pretty much stated what I have to ask in the title. But is taking all of your external .js files and putting them before the closing body tag on your master pages okay on an ASP.NET website? I'm just going off of what YSlow and Google Page Speed have been showing. I can't combine these JavaScript files, so I'm trying to load them "after page load", but doing so makes them useless; some of my jQuery things don't work. I moved my .js files above the opening body tag, and they work. What am I doing wrong? And what could I do to load my .js files after page load? Thanks for any advice anybody can offer!
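
    A sketch of the usual ordering on the master page (file names assumed): the library must come before any script that calls into it, and page-specific code should wait for the DOM.

        <!-- ...page content... -->
        <script type="text/javascript" src="/Scripts/jquery-1.4.2.min.js"></script>
        <script type="text/javascript" src="/Scripts/site.js"></script>
        </body>

    with site.js wrapping its jQuery calls in $(document).ready(function () { ... }); so they run once the DOM is parsed.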

    Read the article

  • CUDA: cudaMemcpy only works in emulation mode.

    - by Jason
    I am just starting to learn how to use CUDA. I am trying to run some simple example code:

        float *ah, *bh, *ad, *bd;

        ah = (float *)malloc(sizeof(float)*4);
        bh = (float *)malloc(sizeof(float)*4);
        cudaMalloc((void **) &ad, sizeof(float)*4);
        cudaMalloc((void **) &bd, sizeof(float)*4);

        ... initialize ah ...

        /* copy array on device */
        cudaMemcpy(ad,ah,sizeof(float)*N,cudaMemcpyHostToDevice);
        cudaMemcpy(bd,ad,sizeof(float)*N,cudaMemcpyDeviceToDevice);
        cudaMemcpy(bh,bd,sizeof(float)*N,cudaMemcpyDeviceToHost);

    When I run in emulation mode (nvcc -deviceemu) it runs fine (and actually copies the array). But when I run it in regular mode, it runs w/o error, but never copies the data. It's as if the cudaMemcpy lines are just ignored. What am I doing wrong? Thank you very much, Jason
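
    One diagnostic sketch worth adding (not from the original post): every CUDA runtime call returns a cudaError_t, and discarding it hides failures such as the absence of a usable device.

        cudaError_t err = cudaMemcpy(ad, ah, sizeof(float)*N, cudaMemcpyHostToDevice);
        if (err != cudaSuccess) {
            fprintf(stderr, "H2D copy failed: %s\n", cudaGetErrorString(err));
        }
        /* check the cudaMalloc calls and the other copies the same way */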

    Read the article

  • Article search engine in php

    - by Jason
    Hello, I am using Sphinx as a search engine on my website. It's working perfectly and I have no complaints with it. The only thing it lacks is that it does not allow me to search with queries longer than 15 words. I know in reality people don't use more than 3-4 words; I want to use it for finding duplicate content. I was wondering if there is any alternative solution to Sphinx, as I want to cope with duplicate content. My main articles table is InnoDB, but I am also caching articles into a MyISAM table for full-text searching. However, when I search an article it takes ages to perform one search. It's not the query problem; I think MySQL's full-text searching facility is lacking. Thanks Jason
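
    For reference, a sketch of the kind of MyISAM full-text lookup being described (table and column names assumed); long natural-language queries like this are exactly where MyISAM full-text tends to crawl on large tables.

        -- requires: FULLTEXT KEY (title, body) on the MyISAM cache table
        SELECT id,
               MATCH(title, body) AGAINST ('opening sentence of the candidate article') AS score
        FROM articles_myisam
        WHERE MATCH(title, body) AGAINST ('opening sentence of the candidate article')
        ORDER BY score DESC
        LIMIT 10;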

    Read the article

  • MATLAB: dealing with java.lang.String

    - by Jason S
    I seem to be stuck in Kafka-land, with a java.lang.String that I can't seem to use in MATLAB functions:

        K>> name

        name =

        Jason

        K>> sprintf('%s', name)
        ??? Error using ==> sprintf
        Function is not defined for 'java.lang.String' inputs.

        K>> ['my name is ' name]
        ??? Error using ==> horzcat
        The following error occurred converting from char to opaque:
        Error using ==> horzcat
        Undefined function or method 'opaque' for input arguments of type 'char'.

    How can I get a java.lang.String to convert to a regular MATLAB character array?
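
    A sketch of the standard conversion: char() accepts Java strings, after which the usual string functions work.

        K>> name = char(name);    % java.lang.String -> MATLAB char array
        K>> ['my name is ' name]

        ans =

        my name is Jason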

    Read the article

  • Sharepoint calendar webpart change views

    - by Jason
    Hi there, I created a calendar list, and I added the calendar web part to another web part page. I noticed that I cannot change the view without going into edit mode and then Modify Shared Web Part. But on the home page of the calendar, there is the same calendar web part with a drop-down menu to change views. Also, there is a small calendar connected to the main calendar web part in the left navigation area, and I don't know how to add it to my web part page. How can I make my web part page look the same? Thanks, Jason

    Read the article

  • Setting a cookie before Javascript Redirection

    - by Jason
    Hello, I have a Rails app where I set a session variable the moment a user lands on my site, recording the referer and the page they hit. Additionally, I have Google Optimizer sending traffic from my homepage to various landing pages. The problem is that I think Google Optimizer is sending users away before the cookie is set. Is that even possible? I believe that the cookie is set from the HTTP header, which must have fully loaded before Google's Javascript has even loaded. Thanks, Jason
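
    For reference, a sketch of the kind of first-hit capture being described (Rails 2-era syntax; names are assumed, not from the post):

        # application_controller.rb
        before_filter :capture_landing

        def capture_landing
          # only record the very first request of the visit
          session[:landing] ||= { :referer => request.referer, :path => request.path }
        end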

    Read the article

  • Named Blueprints with factory_girl

    - by Jason Nerer
    I am using Factory Girl but like the machinist syntax. So I wonder if there is any way of creating a named blueprint for a class, so that I can have something like this:

        User.blueprint(:no_discount_user) do
          admin false
          hashed_password "226bc1eca359a09f5f1b96e26efeb4bb1aeae383"
          is_trader false
          name "foolish"
          salt "21746899800.223524289203464"
        end

        User.blueprint(:discount_user) do
          admin false
          hashed_password "226bc1eca359a09f5f1b96e26efeb4bb1aeae383"
          is_trader true
          name "deadbeef"
          salt "21746899800.223524289203464"
          discount_rate { DiscountRate.make(:rate => 20.00) }
        end

        DiscountRate.blueprint do
          rate {10}
          not_before ...
          not_after ...
        end

    Is there a way of making factory_girl with machinist syntax act like that? I did not find one. Help appreciated. Thx in advance Jason
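
    For comparison, a sketch of the same named variants in factory_girl's own 1.x-era syntax, which supports multiple named factories per class; whether the machinist-style blueprint layer exposes the same thing is exactly the open question here.

        Factory.define :no_discount_user, :class => User do |u|
          u.admin false
          u.hashed_password "226bc1eca359a09f5f1b96e26efeb4bb1aeae383"
          u.is_trader false
          u.name "foolish"
          u.salt "21746899800.223524289203464"
        end

        # :parent inherits and overrides the attributes above
        Factory.define :discount_user, :parent => :no_discount_user do |u|
          u.is_trader true
          u.name "deadbeef"
          u.discount_rate { Factory(:discount_rate, :rate => 20.00) }
        end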

    Read the article

  • Click GEvent.addListener with jquery

    - by Jason
    Created a google map with GMap2 and put pinpoints on there that open up a balloon with the address when the pinpoint is clicked. I would like users to be able to click text on the page itself and have jquery open up the corresponding balloon. However, I can't figure out the ID to use to call a jquery click event. Basically I've got a store listing down the left side, and when a user clicks a store name I want it to open up the corresponding balloon.

        GEvent.addListener(marker_500, "click", function () {
            map.openInfoWindowHtml(point, myHtml);
        });

    Any idea what element is tied to this click event? I tried:

        $("#marker_500").click();

    And that doesn't work. I also tried alerting $(this).attr('id') inside the click function, and that is undefined. thanks jason
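
    A sketch of one way around this (the list-item IDs are assumed): GMap2 markers aren't DOM nodes, so instead of looking for an element ID, keep the GMarker in a lookup and fire its listeners with GEvent.trigger.

        var markers = {};
        markers[500] = marker_500;   // store id -> GMarker

        $("#store-500").click(function () {
            // runs the same callback registered via GEvent.addListener
            GEvent.trigger(markers[500], "click");
        });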

    Read the article

  • Javascript array of href's

    - by Jason
    Hi, I am trying to create an array with different hrefs to then attach to 5 separate elements. This is my code:

        var link = new Array('link1', 'link2', 'link3', 'link4', 'link5');

        $(document.createElement("li"))
            .attr('class', options.numericId + (i + 1))
            .html('<a rel="' + i + '" href="page.php#' + link[i] + '">' + link[i] + '</a>')
            .appendTo($("." + options.numericId))

    As you can see, I am trying to append these items from the array to the end of my page so each link will take the user to a different section of the page, but I have not been able to do this. Is there a way to create elements with different links? I am new to javascript, so I am sorry if this doesn't make a whole lot of sense. If anyone is confused by what I am asking here I can try to clarify if I get some feedback. Any solutions would be greatly appreciated. Thanks, jason
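
    A sketch of the full loop (jQuery 1.4+ for the $('<li/>', {...}) form), assuming the snippet above is meant to run once per index i:

        for (var i = 0; i < link.length; i++) {
            $('<li/>', { 'class': options.numericId + (i + 1) })
                .html('<a href="page.php#' + link[i] + '">' + link[i] + '</a>')
                .appendTo('.' + options.numericId);
        }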

    Read the article

  • ASP.NET Web User Control with Javascript used multiple times on a page - How to make javascript func

    - by Jason Summers
    I think I summed up the question in the title. Here is some further elaboration... I have a web user control that is used in multiple places, sometimes more than once on a given page. The web user control has a specific set of JavaScript functions (mostly jQuery code) that are contained within *.js files and automatically inserted into page headers. However, when I want to use the control more than once on a page, the *.js files are included 'n' number of times and, rightly so, the browser gets confused as to which control it's meant to be executing which function on. What do I need to do in order to resolve this problem? I've been staring at this all day and I'm at a loss. All comments greatly appreciated. Jason
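
    One common approach, sketched here with assumed names: register the include through ClientScriptManager, which de-duplicates by type/key pair, so the file is emitted once no matter how many instances of the control the page holds.

        protected override void OnPreRender(EventArgs e)
        {
            base.OnPreRender(e);

            // emitted at most once per page for this type/key pair
            Page.ClientScript.RegisterClientScriptInclude(
                typeof(MyUserControl),            // hypothetical control type
                "MyUserControlScript",            // arbitrary unique key
                ResolveUrl("~/Scripts/MyUserControl.js"));
        }

    Per-instance wiring can then be driven off this.ClientID so each control instance targets its own DOM elements.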

    Read the article

  • glibc backtrace - can't redirect output to file

    - by Jason Antman
    Hi, I'm in the process of debugging a C program (that I didn't write). I have all of the internal debugging tools (a whole bunch of printf's) enabled, and I wrote a small PHP script that uses proc_open() and just grabs both stdout and stderr, and time-coordinates them in one file. At the moment, the binary is dying with a realloc() error that's caught by glibc, and a glibc backtrace is printed, beginning with:

        *** glibc detected *** /sbin/rsyslogd: realloc(): invalid next size: 0x00002ace626ac910 ***

    Here's the thing I don't understand: I've confirmed that the PHP script is catching both stdout and stderr from the binary's process and writing them to the correct files, but this backtrace is still printed to the console. Where is this coming from? Is there some magical output channel other than stdout and stderr? Any ideas on how I go about capturing this backtrace to a file, or sending it out with stderr? Thanks, Jason
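
    A sketch of the likely explanation and fix: glibc writes these fatal heap-corruption reports to the controlling terminal (/dev/tty) rather than stderr, unless told otherwise via the LIBC_FATAL_STDERR_ environment variable.

        # force glibc's fatal messages onto stderr so the wrapper can capture them
        LIBC_FATAL_STDERR_=1 /sbin/rsyslogd 2>>/var/log/rsyslogd.stderr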

    Read the article

  • Problem with ajax and posting non-latin characters

    - by jason
    Posting non-Latin-based languages with ajax + jquery doesn't save the correct text to MySQL. What I have done is this: I am getting multiple translated words from Google's translation API. The ajax request is showing the correct translations for all languages, but when I try to insert them into the db the text shows up in phpMyAdmin as garbled. I added AddDefaultCharset UTF-8 to the .htaccess file on the root. I tried setting the header in php to utf-8 and this did not work. I have tried adding a contentType to the ajax setup, but this didn't work either. Any suggestions appreciated. jason
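
    A sketch of the piece most often missing in this setup (connection credentials assumed): the MySQL connection itself must be switched to utf8, independently of the page headers and the table's collation.

        $db = mysql_connect('localhost', 'user', 'pass');
        mysql_set_charset('utf8', $db);   // PHP >= 5.2.3; otherwise: mysql_query("SET NAMES utf8")
        header('Content-Type: text/html; charset=utf-8');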

    Read the article
