Tuesday 31 May 2016

This Android phone will put military-grade security in your pocket — for $14,000

The Samsung Galaxy S7 Edge costs less than $800. The new Solarin Android smartphone costs £9,500, which is about $14,000.
The team behind Solarin, Sirin Labs, announced the details on its super-expensive smartphone Tuesday, which the company said is designed for international businesspeople who want to keep sensitive information private. According to the company's press release, the phone uses the same communication encryption technology as the military.
That kind of security is enticing to some people, but $14,000 buys more than encryption. Inside the Solarin are 128GB of base storage, 4GB of RAM and a 2GHz eight-core Snapdragon 810 processor. It also boasts five antennas and support for 24 LTE bands, reaching data speeds of up to 4.6 Gbps.
That all sounds pretty impressive, but it pretty much matches the Samsung Galaxy S7 Edge's hardware specs. The S7 Edge actually has a better processor, the Snapdragon 820, although it comes up short on storage with only 32GB.

Not only that, the Solarin is a 5.5-inch phone, but it weighs more than half a pound. That's pretty heavy.
Besides the built-in encryption, the difference-makers of Solarin are on the back. There's a 23.8-megapixel camera and fingerprint sensor surrounded by Italian leather, as well as a special security switch you can press to ensure maximum security during phone calls or while sending messages.

The phone clearly isn't made for the average person, Italian leather or not. For a smartphone that wouldn't perform much better than most new phones on the market, the $14,000 price tag hardly seems justifiable for those of us lacking super-cool secrets. But for people with high-value information on their smartphones, that high price could look pretty enticing compared to the cost of having that information stolen.
Whether the phone will succeed is questionable: how many people have information valuable enough to justify shelling out that much money? At least some investors think it will, given that Sirin Labs raised $72 million in seed funding last year from backers including Moshe Hogeg, founder of the bafflingly limited messaging app Yo.

Top Five Free Business Apps You've Never Heard Of




This list is purely subjective, on one hand. On the other hand, I really do think these are the best free business apps, because I've looked at other lists and they all feature the usual suspects, like Evernote, Dropbox or Skype. How very enlightening. NOT. So here is my version.

1. Bitrix24


I hereby declare Bitrix24 the best free business app that you probably don't know about. In fact, Bitrix24 is 35+ business tools inside a single platform: CRM, project management, employee management, virtual office, cloud call center, document management, invoicing, email marketing and much more. There are three things that Bitrix24 does exceptionally well. First is CRM and client management. Second, collaboration, planning and project management. Third, human resources information systems. One review called Bitrix24 a "business management platform," and I think that's the best description for it. Or you can think of Bitrix24 as Salesforce+Skype+Dropbox+Basecamp in one. Totally free for 12 users. Available for iOS, Android, Web, Unix, Linux, Windows and macOS.

2. Qwilr


Qwilr is a bit difficult to explain if you've never tried it. Essentially, it turns any business document into an interactive webpage. Think of Qwilr as free quote-and-proposal software. Or a way to create beautiful presentations for your clients. Or reports for investors. That's Qwilr. Founded by ex-Googlers, this service has a real chance of becoming the next Prezi.

3. Wave Accounting


I wish every service had a pricing page like WaveApps'. The service is absolutely and totally free (and it's a very powerful accounting platform). It's especially great at scanning receipts, which you do simply by snapping them with your phone's camera once you install the companion app, Receipts by Wave. Unlike Expensify, which, as the name implies, handles only expenses, Wave Accounting can also be used as a payroll and invoicing solution.

4. Bizagi



Bizagi is a free business process modeling solution used by more than 500,000 users worldwide. It is 100% based on BPMN notation, the de facto standard for business process automation nowadays. Granted, not many smaller businesses think about processes, standards and procedures, but if you are ready to grow, definitely give Bizagi a try.

5. Schedulehead



Schedulehead is an employee scheduling service. As long as it's used by no more than 10 employees and one administrator, it's absolutely free of charge. The free service includes a knowledge base and email notifications. If you operate your business with tablets and smartphones, ShiftCalendar may be a better choice, but for a non-mobile workforce I think Schedulehead works better. You may also want to consider upgrading to a paid plan if you want your employees to receive text messages with their schedule updates.

If you find this post interesting, do share it on Facebook and Google Plus.



Facebook may add end-to-end encryption to Messenger, report says

               



Facebook Messenger may follow WhatsApp in implementing stronger encryption, according to a new report. 
The social network could add end-to-end encryption to its Messenger app later this year, even though it may come at the expense of some of Facebook's artificial intelligence features, The Guardian reports.
The new encryption measures, which would make messages sent through Messenger more secure, will reportedly roll out as an "optional" encrypted mode that users would need to turn on. If true, that would differ from the approach of Facebook-owned WhatsApp, which enabled end-to-end encryption by default across all its apps last month.
The difference, according to The Guardian, is that Messenger's encryption would come at the expense of some of its newer artificial intelligence efforts, including its bots. Facebook's Messenger bots (and its experimental assistant, M) learn from users' messages in order to get better at replying to requests. But, as the report points out, this requires that messages are stored on company servers, while fully encrypted messages can only be seen by the sender or receiver. 
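The reason fully encrypted messages are off-limits to server-side AI is worth a quick sketch. End-to-end encryption rests on key agreement: the two endpoints derive a shared secret that the relaying server never sees. Here is a toy Diffie-Hellman illustration in Python; real messengers use vetted protocols (such as the Signal protocol) with much larger parameters, so this is only to show the idea, never something to deploy.

```python
# Toy Diffie-Hellman key agreement: the server relays only public keys,
# yet both endpoints derive the same secret. Illustrative only.
import hashlib
import secrets

# Tiny public parameters for the demo (2**127 - 1 is a Mersenne prime);
# real systems use primes of 2048 bits or more.
P = 2**127 - 1
G = 3

def keypair():
    private = secrets.randbelow(P - 2) + 2   # kept secret, never transmitted
    public = pow(G, private, P)              # safe to send through the server
    return private, public

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# Each side combines its own private key with the other's public key.
alice_secret = pow(bob_pub, alice_priv, P)
bob_secret = pow(alice_pub, bob_priv, P)
assert alice_secret == bob_secret            # same key, never sent over the wire

# Derive a symmetric message key; the server only ever saw the public keys,
# so it cannot decrypt the conversation -- and neither can a learning bot.
message_key = hashlib.sha256(str(alice_secret).encode()).hexdigest()
```

Because only the endpoints hold `message_key`, any feature that needs to read message content on Facebook's servers, bots included, is locked out by design.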


A Facebook spokesperson declined to comment on the report, but earlier reports have also suggested Facebook has been looking into increasing the encryption of Messenger in the wake of Apple's very public battle with the FBI over its use of encryption.
Facebook isn't the only company looking to increase its security, even as it pushes harder into AI. Allo, Google's upcoming messaging app, will also offer extra security as an "opt-in" feature for those who don't want to take advantage of the more AI-focused features like smart replies and Google's Assistant.






Twitter is hiring in Singapore for its first data science team outside the U.S.


SINGAPORE — Twitter is searching for a team of data scientists to be based in Singapore, in its first effort to create such a team outside of the U.S.
Twitter's Linus Lee, who will be moving from Twitter's San Francisco headquarters to Singapore to head the team here, told Mashable the team will be between five and 10 strong, and will focus squarely on growing markets outside the U.S. — where 80% of the company's users now are.
Lee, a Singaporean who left the island state 10 years ago for undergraduate studies at Stanford, was one of Twitter's early data science hires four years ago.
In the time since he joined, Twitter's data science teams have multiplied to encompass a business-focused team, which advises the finance side on trends, and other teams within engineering, although Lee wouldn't specify how many data scientists Twitter now has.
His own team in the U.S. works with the product development folks to shape the user experience, depending on what the data reveals about user behaviour. The new Singapore team will also be steered in this direction, and will focus on how users in emerging markets with less capable devices on slower networks act on the network.
For instance, in many of these markets such as India, users operate on 2.5G networks, which are far slower than the 4G speeds that mature markets are accustomed to. Based on user bounce rates shown by the data, the Twitter app now adjusts for slow connections and downgrades images or videos, to help the user experience along, Lee said.
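The adjustment Lee describes boils down to a policy: measure the connection, then serve a lighter rendition of the media. A hypothetical sketch in Python; the variant names and bandwidth thresholds here are illustrative, not Twitter's actual values.

```python
# Hypothetical media-quality policy keyed on measured bandwidth.
# Thresholds and variant names are invented for illustration.
def pick_media_variant(bandwidth_kbps: float) -> str:
    """Return which rendition of an image or video to serve."""
    if bandwidth_kbps < 100:      # ~2G: skip heavy media entirely
        return "placeholder"
    if bandwidth_kbps < 400:      # ~2.5G: low-res image, no autoplay video
        return "low"
    if bandwidth_kbps < 2000:     # ~3G: medium-res image
        return "medium"
    return "full"                 # 4G and up: full-quality media

assert pick_media_variant(250) == "low"    # a 2.5G user gets the light payload
assert pick_media_variant(5000) == "full"  # a 4G user gets everything
```

The data science work is in choosing those thresholds: bounce rates by network class tell the product team where the cutoffs should sit.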
"The data science team looks at the data, models it and makes recommendations so the product team can test outcomes," he said. "Previously, emerging markets teams had to beg for data science support, but now they'll have a dedicated resource (in Singapore)."

What Twitter is looking for

The job ad for the new Singapore team is already up on Twitter's site, and Lee said they're looking to hire both junior and senior analysts.
"I don't think age is a factor, but what separates people is the ability to apply the (data) tools and techniques in the right way. To understand what the business needs, and apply it.
"We need independent thinkers, because they need to suggest (features) to our product people," he said.
He noted that Singapore continues to produce more qualified statisticians out of schools here, so the scene is ripe for hiring. But many here will be inexperienced compared with counterparts in Silicon Valley when it comes to handling huge data sets, because there are fewer opportunities here to handle data the scale of what Twitter has.

He said Twitter now has 310 million monthly actives, and data from 500 million logged-out visitors per month. Hundreds of millions of tweets are sent daily, so the vast amount of unstructured data gets crunched continuously by the social media giant.
Tens of thousands of Hadoop jobs run overnight in batches across Twitter's data centres holding petabytes of information, he said.
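Those overnight batch jobs follow the classic map-reduce pattern: map each record to intermediate key-value pairs, then reduce them into aggregates. A toy Python version over a handful of made-up tweets shows the shape of it (the real thing runs across petabytes on Hadoop clusters):

```python
# Toy map-reduce term count over a few invented "tweets".
from collections import Counter
from itertools import chain

tweets = [
    "data science at scale",
    "science of data",
    "scale matters",
]

# Map: split each tweet into terms; Reduce: aggregate counts across tweets.
mapped = (tweet.split() for tweet in tweets)
term_counts = Counter(chain.from_iterable(mapped))

assert term_counts["data"] == 2
assert term_counts["scale"] == 2
```

At Twitter's scale the same two phases are simply sharded across thousands of machines, with the reduce step merging partial counts from each one.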
This effort to chase new users comes as the company struggles to keep user growth up for investors. The topic has been a pain in its side in recent years, and the firm has started to tinker more with the fundamental experience in hopes of jumpstarting a stalled user base.
This includes algorithmically tweaked timelines (which, predictably, drew ire from users) and consideration of lifting the 140-character limit, long considered the hallmark of the service's appeal.
"We seek to understand different users," Lee explained. Users in Japan, for instance, mostly choose to be anonymous on Twitter, whereas those in the West prefer to tie their real-life identities to the service.
"In the last four years, the proportion of users outside the U.S. has only been growing. You can't just apply what you know (about the U.S. base) over here," he said.

Those rumors of an all-glass iPhone just got a lot more realistic

                 





Even though it’s way too early to be talking about the 2017 iPhone, rumors about the device continue to persist.
Thus far, the rumors have suggested that the 2017 iPhone (maybe the iPhone 7S?) will return to an all-glass design. This was first floated by Apple analyst Ming-Chi Kuo back in March. 
Noted Apple watcher John Gruber shared on his podcast The Talk Show that his sources (which we believe are quite good) indicated the next next iPhone would have an all-glass case and an edge-to-edge display. Gruber has even heard that the home button/Touch ID sensor might be integrated into the display itself.
But this was all just speculation. And frankly, although we don't doubt that Apple has been working on the 2017 iPhone for quite some time (the lead time for devices like the iPhone and iPad is often years, not months), it's odd for such concrete details to leak so early.
But now those rumors seem even more likely to be true. At its annual shareholder meeting, Allen Horng, chairman of longtime iPhone supplier Catcher Technology, indicated that the 2017 iPhone would indeed have an all-glass design. Catcher has been the chassis maker for iPhone devices for a number of years, and a move to a non-aluminum chassis could mean Catcher would lose business.
Speaking with reporters last week, Horng said that, “As far as I know, only one [iPhone] model will adopt glass casing next year.” He added that he didn’t think this would impact his company’s revenue because “glass casing still needs a durable metal frame which requires advanced processing technology and would not be cheaper than the current model.”

Stay skeptical

We should note that comments from a supplier chairman, though interesting, don't confirm anything about Apple's future iPhone plans. For all we know, Horng could be reading the same reports and rumors as the rest of us. It's very unlikely that Apple has decided on all of the parts for its 2017 iPhone.
Still, the fact that a supplier is even commenting on these kinds of issues gives credence to the idea that the 2017 iPhone will have a brand-new design.
On Facebook Live last week, I theorized that we might see a big redesign with the 2017 iPhone, simply because it will be the 10th anniversary of the iconic, groundbreaking, industry-shifting product.
Everything we’ve heard about the iPhone 7 suggests that Apple won’t be changing the design dramatically from the iPhone 6/iPhone 6S – you know, except for that whole no headphone jack thing.
And now, it’s time for my obligatory prediction for the next iPhone (and the next next iPhone, and the next next next iPhone):
  • It will be Apple's fastest iPhone ever.
  • It will have Apple's most advanced optics.
  • There will be at least one extra feature you can't get on any other iPhone.
  • It will come in rose gold.

The basic ingredients for life have been found in a comet's atmosphere


For the first time, scientists have directly detected a crucial amino acid and a rich selection of organic molecules in the dusty atmosphere of a comet, further bolstering the hypothesis that these icy objects delivered some of life's ingredients to Earth.
The amino acid glycine, along with some of its precursor organic molecules and the essential element phosphorus, was spotted in the cloud of gas and dust surrounding Comet 67P/Churyumov-Gerasimenko by the Rosetta spacecraft, which has been orbiting the comet since 2014. While glycine had previously been extracted from cometary dust samples that were brought to Earth by NASA's Stardust mission, this is the first time the compound has been detected in space, naturally vaporized.
The discovery of those building blocks around a comet supports the idea that comets could have played an essential role in the development of life on early Earth, researchers said.
"With all the organics, amino acid and phosphorus, we can say that the comet really contains everything to produce life — except energy," said Kathrin Altwegg of the University of Bern in Switzerland, the principal investigator for the Rosetta mission's ROSINA instrument.
"Energy is completely missing on the comet, so on the comet you cannot form life," Altwegg told Space.com. "But once you have the comet in a warm place — let's say it drops into the ocean — then these molecules get free, they get mobile, they can react and maybe that's how life starts." 

Getting a glimpse

Glycine, one of the simplest amino acids, is usually bound up as a solid, which means it's difficult to detect from afar, Altwegg said.
While scientists have searched for glycine through telescopes in star-forming regions of the sky, the newly reported detection marks the first sighting of the compound in space. In this case, the orbiting Rosetta was close enough to pick up the glycine released by the comet's dust grains as they heated up in the sun.
The study is a powerful confirmation of earlier, earth-bound detections of life's building blocks in comet and meteor material.
"We know the Earth was pretty heavily bombarded both with asteroidal material and cometary material," said Michael A'Hearn, a comet researcher at the University of Maryland who was not involved in the new study.
"There have been various claims of amino acids in meteorites, but all of them have suffered from this problem of contamination on Earth. The Stardust [samples] — which are from a comet, not an asteroid — are probably the least susceptible to the terrestrial contamination problem, but even there the problem is severe," A'Hearn told Space.com. "I think they [Stardust] really did have glycine, but this is a much cleaner detection in many ways."

Cooking up life

Amino acids form the basis of proteins, which are complexly folded molecules that are critical to life on Earth. Altwegg's team searched for other amino acids around the comet as well, but located only glycine — the only one that can form without liquid water (as in the frigid reaches of space).
The glycine probably didn't form on the comet itself, Altwegg said, but rather in the broad stretches of dust and debris that made up the solar system before planetary bodies formed. 
"The solar system was made out of material which formed in a disk, in a solar nebula," Altwegg said. "In these clouds, it's pretty cold, so the chemistry you do there is catalytic chemistry on the dust surfaces. And these very small dust grains [1 micron in size] are very good to lead to organic chemistry. This is also done in the lab." Earth itself was far too hot for similar delicate amino acids to survive its formation, Altwegg said; only the smallest solar system bodies stayed cold.
So glycine formed during that time could have provided a boost to newly forming life if it was delivered to Earth by comets.
"It's not that it couldn't have formed on Earth — it certainly could — it's just that it didn't have to," A'Hearn said. "Basically, the Earth got a head start."
Other, more complex amino acids require liquid water, and so would have likely formed on Earth itself, Altwegg said. This idea is supported by the fact that Rosetta has not identified any amino acids other than glycine near Comet 67P.
Phosphorus is also vital to life as we know it. Among other things, the element is a key constituent of DNA and adenosine triphosphate (ATP), a molecule that stores the chemical energy used by cells.
Rosetta is the first spacecraft to bring the right kind of instrument up close to a comet; future probes could examine other comets or even bring frozen samples back for analysis, to see how representative 67P is of comets in general.
But in the meantime, the team is still working on understanding all the organics they found and analyzing them further. "And I think the next step goes to the biochemists, how to make something meaningful out of this," Altwegg said.
The discovery is also significant to researchers trying to understand the conditions of the early solar system, when the comet's nucleus first came together, not to mention conditions when the early Earth was bombarded by similar comets.
"For astrobiology, it's a very important measurement," Altwegg said. "And it's not only life on Earth; the material in comets has been formed in a protostellar cloud, and what could have happened here in our protostellar cloud could have happened everywhere in the universe."
"Then you can ask yourself the question: How many Earths are there, how many evolved life or re-evolved life?" she added.
The new work was detailed in the journal Science Advances May 27. 



This article was originally published at Space.com.







Intel launches first-ever 10-core desktop processor


Intel's most powerful desktop processor has 10 cores, but it'll cost you
Throughout the 1990s and early 2000s, processor manufacturers raced to offer processors with ever-increasing clock speeds, which more or less equaled performance.

Then, when clock speeds hit a wall at about 4 GHz due to a variety of factors, they took a back seat, and manufacturers started a race to increase the number of processor cores in a single CPU. Intel introduced its first dual core processor for home use in 2006, and it took about seven years for the number of cores in processors for desktops and laptops to reach eight. 
Now, at the Computex trade show in Taipei on Monday, Intel launched its first 10-core processor aimed at home users (the company already sells 10-core Xeon processors, but those are for professional use). 
A part of Intel's 14-nanometre Broadwell-E chip family, the Intel Core i7-6950X Extreme Edition is primarily aimed at gamers and enthusiasts. Its 10 cores (with two threads per core) run at a 3GHz base frequency, but the CPU is unlocked and can be overclocked to higher speeds, if you have the right cooling. It also features Intel's Turbo Boost Max 3.0 tech, which "steers" applications to the highest-performing core, ideally meaning that even programs that don't know how to use multiple processor cores should run faster.
Side note: You're probably used to seeing classical prefixes denoting multi-core processors; an 8-core processor is octa-core, so why isn't the 10-core processor a deca-core? The answer: "10-core" is simply what Intel officially calls it. We'll still call it deca-core in our hearts, though.
So who needs this thing? In its press release, Intel points out that this would be the perfect processor for virtual reality gaming, be it on the Oculus Rift or the HTC Vive. Another application would be video editing, a task that typically makes good use of multiple processor cores. 
Unfortunately, the top processor in Intel's range comes at a very high price — $1,723. Add to that the cost of a high-end graphics card and other components, and you could easily be hitting $10,000 for a single PC. But hey, if you can afford it...







I made iMovie magic on the iPad Pro

For the last few months, I spent hours creating a video celebrating my quarter century of marriage. It’s honestly too personal to show you, but what I learned in the process is not.
You can make something quite awesome with an iPhone, iMovie (for the iPad) and an iPad Pro.
When I got married, we couldn’t afford a videographer. We ended up with beautiful photos and the memories we could replay in our own heads. For my project, I decided to create a wedding video out of available assets (our old wedding photos) and newly shot material.
I’ve been editing video on and off for decades, but I had never done something at this scale. There were a lot of moving parts and, honestly, a lot I didn’t know about what I would need and how I could actually edit it.
Over the years, I’ve used a bunch of different desktop video editing apps, most notably Adobe Premiere and Premiere Elements in Windows. The former is a complex and powerful tool for pros (read: not me). I used it, but found it daunting. Premiere Elements has the power of its namesake, but hides almost all the complexity. But there was good reason for me to not use it.
This project had to be done in secret. I couldn’t sit at our home desktop editing this video, not if I planned to surprise my wife. Likewise, I couldn’t edit it in the office when I was supposed to be working. I decided that I’d have to try and get the job done on mobile.

Editing and shooting on the go

When Apple introduced its 12.9-in. iPad Pro in 2015, I was immediately smitten. It had desktop power squeezed into a giant touchscreen. Plus, it could run my increasingly go-to mobile video editing app, iMovie.
I’d started using iMovie to edit quick video clips around the same time Apple introduced the iPhone 6s and 6s Plus. These two powerful phones could handle editing up to 4K video.
Yes, you can edit video on your tiny iPhone

Soon, I was attending events, grabbing a few clips of 1080p video with the iPhone and then editing them on the iPhone with iMovie before posting them on social media. I appreciated iMovie's simplicity and speed. Video rendering on these (short) clips was fast and ensured I could still be timely with a tweeted bit of video news.
As the months passed, I grew confident in my iMovie abilities. Even so, I wasn't entirely sure that iMovie on the iPad Pro could handle this project. There would be so many moving pieces; plus, I wondered whether I could edit without a mouse, using just my fingers and gestures. Still, there was no getting around the iPad's portability and my growing enthusiasm for iMovie.
So I dove in.

Assets

As I conceptualized the project, I knew that I would need to shoot some of my own video on an iPhone 6s Plus. I actually used a tripod and tried to frame my shots the way a real documentarian might. I did not, though, have a boom mic, a fact that would later impact my editing process.
In addition to the video I shot, I also received clips over iMessage, via email and even some through Facebook Messenger. The quality and aspect ratios were all over the map. I downloaded all the files to the iPad Pro.
Virtually all the images you see here were photos I shot with an iPhone 6S Plus

Since I also planned to use a lot of wedding photos, I had to decide how to get them into the tablet. To save time, I simply photographed them with the iPhone 6s Plus. Since most were matte finished, this actually worked surprisingly well. The only thing I did not count on is that all my photos appeared in my Photo Stream and would show up on our HDTV every time we turned on Apple TV and it went into sleep mode. My wife noticed, but still had no idea what I was doing.
I transferred my assets onto the iPad Pro via AirDrop.

Inside iMovie

I’m convinced that even video neophytes can figure out iMovie.
It’s organized into three major areas: the playback window in the upper left. To the right of it is your asset pane — which you can collapse for more video-editing space — and below both of them is the timeline. In the iPhone, you only get playback and the timeline, with icon access to assets (photo, video and audio). I really came to appreciate all the space afforded by the giant 12.9 iPad Pro (I also used the smaller iPad Pro 9.7 to work on the project).
iMovie's layout is clear, simple and built for your fingers

Under the asset pane are icons for content types: video, photos and audio. You just select one to see the content stored on the iPad. Photos and video are sorted similarly, by All, Recently Added, Favorites. Photos also includes types like Panoramas and groups you’ve created.
The more you pre-organize your content, the better off you are. I did not do enough of that and was often scrolling through hundreds of images and videos to find the one I wanted. As for Audio, it helps to know what you are looking for, since there is no Recently Added. It's best to create a soundtrack playlist in iTunes; then you can find all your tracks grouped together in iMovie.

Photo story

As I noted earlier, I had no original wedding video, just the fresh interviews (and some b-roll) I captured in the weeks leading up to editing my project. That meant I’d be using photos – a lot of them – throughout the video. iMovie, though, has at least two great features for bringing still images to life. One is called the Ken Burns Effect. Yes, that Ken Burns.

Fans of Burns's epic documentary The Civil War will recall how he breathed life into 130-year-old images by constantly moving the camera as voice-overs and music played over them.
When you drag and drop photos onto your timeline, iMovie automatically adds a subtle Ken Burns effect, usually slowly zooming in on the photo. However, iMovie also lets you choose where to start and end the effect. I used this to particularly good effect on family shots, basically making the camera pan across a lineup of family members.
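Under the hood, a Ken Burns effect is just a crop rectangle interpolated between a start view and an end view, one step per frame, with each crop then scaled to the output size. A minimal sketch of that interpolation (not Apple's implementation; the photo dimensions are illustrative):

```python
# Compute per-frame crop boxes for a Ken Burns pan/zoom.
def ken_burns_crops(start, end, num_frames):
    """start/end are (x, y, width, height) crop boxes; yields one per frame."""
    for i in range(num_frames):
        t = i / (num_frames - 1)          # 0.0 at the first frame, 1.0 at the last
        yield tuple(s + (e - s) * t for s, e in zip(start, end))

# Slow zoom-in: from the full 4000x3000 photo to a 2000x1500 detail.
frames = list(ken_burns_crops((0, 0, 4000, 3000), (1000, 750, 2000, 1500), 5))
assert frames[0] == (0, 0, 4000, 3000)
assert frames[-1] == (1000, 750, 2000, 1500)
```

A pan across a family lineup is the same math with the width and height held constant and only x changing between the start and end boxes.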

More documentary effects

The second part of my master plan to make a largely photo-based video interesting involved, naturally, audio.
While it can be fun to watch people sit around talking about a wedding that happened a quarter of a century ago, I realized I’d want to keep the talking head face time to a minimum and use their stories to drive some of the narrative behind the original images.
Fortunately, iMovie lets you detach the audio from your video clips. I used this throughout my project to add voice-overs to static photos and some b-roll. (You can also use detaching audio to delete audio from any video.)
You'll find some of iMovie's best features under the Actions Menu. It's where you can split video, detach audio and even duplicate clips

Initially, I would choose a clip and detach the audio from the whole thing. But I soon learned that the only way to really keep track of which audio went with which video clips was to confine my detached audio to as small a portion of the video as necessary.
What I’d do is split the video clip near where I wanted the voice-over portion to begin. Then I’d drop an image in next to the video clip and use my finger to trim the video back almost to the beginning of where I split it off. As I did this, the photo would slide over the audio track. I made sure to have the photo, which can play out Ken Burns effect for as long as you want, last as long as the detached audio. The result is that a subject would start speaking on camera and then the video would transition to a photo as they continued to talk over it. It’s a classic documentary look.
As long as I did not pick up the video with detached audio and move it around, iMovie did a good job of keeping the audio and video in sync.

Say what?

In my interviews, I noticed that people often switched subjects with barely a pause between sentences. I needed people talking about the same thing grouped together, so I had to make some pretty precise video cuts.
iMovie let me work with the clips broadly by zooming out on the whole timeline. But if I needed to be precise, I pinched and zoomed as far as I could to find the exact cut point I needed in a 5-second clip. Throughout my project, I was constantly zooming in and out.
Adjusting audio volume in iMovie is easy, but it's the fade controls that make it sound professional

I could also clean up some of the audio, where it sounded like I had abruptly cut people off, by adjusting the Trim and Fade on audio. I just had to select the audio clip, then hit the fade button.
On the actual audio clip, I could adjust the fade (going from full volume to none) as far as I wanted. I did it until people sounded like they had naturally stopped speaking, but not so much that I faded away what they were actually saying.
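What a fade-out actually does to the signal is simple: it ramps a gain factor from 1.0 down to 0.0 across the fade region, so the speaker trails off instead of being chopped. A minimal sketch on a list of PCM samples (illustrative only, not iMovie's code):

```python
# Apply a linear fade-out to the tail of an audio clip.
def fade_out(samples, fade_len):
    """Linearly fade the last fade_len samples to silence."""
    out = list(samples)
    start = len(out) - fade_len
    for i in range(fade_len):
        gain = 1.0 - (i + 1) / fade_len   # reaches 0.0 on the final sample
        out[start + i] = out[start + i] * gain
    return out

faded = fade_out([1000] * 8, fade_len=4)
assert faded[:4] == [1000, 1000, 1000, 1000]   # untouched before the fade
assert faded[-1] == 0.0                        # fully silent at the end
```

Dragging the fade handle in iMovie is effectively choosing `fade_len`: a longer fade gives a gentler trail-off, a shorter one clips the speaker sooner.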

Don’t get ahead of yourself

As good as iMovie is, I learned a hard lesson about halfway through my project. I was busy cutting clips, detaching audio and moving things around (I even tried to add a soundtrack, but more on that later) when I realized that I had somehow screwed up the video and audio synchronization. It was so bad that I could not figure out how to undo all that I had done to have the right audio playing with the right talking heads.
(A versioning feature would have been a huge help here, or even just the ability to save a project under a different name and then work on the original. This is one place where Windows video editing would have made things easier.)
So I started over, but with a better understanding of how I had to manage the project. I decided to not detach a single audio clip until I had almost every video clip and photo in position. It was like managing a complex magazine layout or collage. I gently placed all the pieces down without, in essence, applying any glue. After that I would work on creating the right voice-over segments, Ken Burnsing the photos and the very last thing I would do is add a soundtrack.

Like a melody

Even though iMovie has a track reserved for soundtrack or music, this is one of the app’s weakest areas. You have to add music sequentially; it is impossible to drop in a track at a specified place and pin it there. Eventually I figured out how to choose all my tracks and assign their order before putting them on the actual soundtrack.
As in other audio areas, I could adjust the volume, but I love that iMovie automatically lowers the music volume when there's an audio track in the foreground. This made it easier to manage sound levels, though I still had to do a lot of hand-adjusting on clip audio, because all the video I'd been given was recorded at different levels, and splitting audio from video changes the level, making it louder yet again.
The good news is that you can easily add a soundtrack to your movie. The bad news is that this is one of iMovie's most basic features

For my soundtrack, I used a combination of the tracks that iMovie provides, which are pretty good, and music I bought myself. Since this was not a public/commercial project, I didn’t worry about royalty and copyright issues.

The in-betweeners

I also experimented a fair amount with iMovie's built-in transitions. Having music behind everything meant I didn't need as many transitions, but I still found areas where a dissolve worked well. The more complex themes and wipes were a little too distracting for my video.
You can add some pretty sick-looking text transitions in iMovie

I also wanted something bigger to break up what I thought of as the chapters in my video. Eventually I settled upon dissolving into and out of a black screen with some text appearing to announce the next chapter topic. To make that video, I recorded on my iPhone with my hand covering the lens. Voilà! Black video. I just needed to delete the audio and then add text.
I love iMovie’s title overlay tools, which I used a little on top of photos, but mostly for the chapters. I settled on one Title style and then used it for all the chapters. I also had the added benefit of using the iPad Pro's attachable Smart keyboard, which meant I didn't lose any screen real estate to a virtual keyboard.

The final cut

Ultimately, I spent weeks editing and producing a 20-minute video entirely in iMovie on an iPad Pro, mostly while commuting on a train. I learned a lot during the project, even if I didn't use all of the advanced features in iMovie (I never employed split-screen or picture-in-picture video, for instance).
iMovie could use a few changes, like something akin to a file system for project management, a better way of managing assets and a much smarter audio track system, but, overall, I am impressed. If you have a new iPad or even an iPad Pro and iMovie (which is free!), why wouldn’t you be doing all your video projects on the iPad?
As for my movie, I'm no Spielberg and Ken Burns would likely be appalled, but I'm happy with it. The quarter-century-late wedding video premiered to a tiny audience on Facebook. It made my wife cry. For me, that’s as good as an Academy Award.