Friday 29 September 2017

Germany is the biggest source of botnets in Europe

Just a year after the Mirai malware infected connected devices to create the first IoT botnet, new research from Norton shows that global botnets have continued to grow and spread as a result of unaware users inadvertently infecting others. According to Norton, the top three countries responsible for hosting the highest number of bots in Europe are Germany at just over eight percent, Italy at 10 percent and Russia at almost 14 percent. The UK was Europe's 11th highest source of bot infections -- which is down from 2015 when it was ranked seventh. In terms of specific cities, Madrid in Spain… [Continue Reading]


Read the full article here by Betanews

Wednesday 27 September 2017

Kibana 5 internals - Dashboards, Visualizations and Index Patterns

Kibana is one of the Elastic products, part of the Elastic stack (formerly known as ELK).

Kibana is an open-source web app written in Angular that runs on a thin Node.js server (which acts as a web server).

Kibana connects to a single Elasticsearch node.
It's possible to put a load balancer in front of several nodes, but best practice suggests pointing Kibana at a Coordinating node, a Tribe node (deprecated), or a Cross-Cluster node (if you need one).

Keep in mind: Kibana has no actual reason to exist without an Elasticsearch cluster to connect to.
The opposite is not true: Elasticsearch can live on its own and be quite useful without Kibana and can be even connected to Grafana if you really need visualizations.

Kibana has very little local state/persistence: the configuration files and some cache files. It relies on Elasticsearch for most of its features (e.g. authentication if you own X-Pack) and for storing all the data it needs to run.

Kibana provides a cool interface to build dashboards.

Kibana 5.0 dashboard screenshot from Elastic website
The dashboards are made up of several visualizations.

A visualization can be something static (a Markdown field to show some comment or help message) or dynamic (a pie chart, a table or a histogram).

Dynamic visualizations are generated from data retrieved from Elasticsearch.
The query to Elasticsearch is normally built behind the scenes by the UI, though it's even possible to write your own query and use its output in the visualization.
Most of the queries sent to Elasticsearch are aggregations.
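As a concrete sketch of the kind of aggregation Kibana builds behind a histogram visualization, here is a hypothetical request body in Python; the field names and query string are made up for illustration:

```python
# Sketch of the aggregation body Kibana typically generates for a
# date-histogram visualization (field names are made up for illustration).
def histogram_query(time_field, interval, lucene_query="*"):
    """Build an Elasticsearch date-histogram aggregation request body."""
    return {
        "size": 0,  # no raw hits needed; only the aggregation buckets
        "query": {"query_string": {"query": lucene_query}},
        "aggs": {
            "over_time": {
                "date_histogram": {"field": time_field, "interval": interval}
            }
        },
    }

# e.g. one bucket per hour of documents matching status:500
body = histogram_query("@timestamp", "1h", "status:500")
```

POSTing such a body to an index's `_search` endpoint returns only the buckets, which is exactly what a chart needs.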

A visualization typically refers to data fields exposed by an index pattern.

What is an index pattern? Let's talk about index templates first.

If you are familiar with the Elastic stack, you've probably heard about index templates.
Index templates are one of the key parts of the Elasticsearch configuration.
An index template tells the cluster how the data you send into an index (or into indices matching an index name pattern):

  • should be analyzed
    • if a text must be tokenized for free text search or taken as it is
    • if a float should be maybe mapped into a scaled float to save some space
    • if a field shouldn't be indexed at all
    • ...
  • should be stored and distributed
    • number of shards
    • number of replicas
    • how often the index gets refreshed
    • if you want to keep the original document or not
    • ...
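A sketch of an index template touching both groups of settings; the index name pattern, mapping type, and fields below are all hypothetical:

```python
# Hypothetical index template combining analysis and storage settings.
template = {
    "template": "logs-*",  # applies to any index matching this name pattern
    "settings": {
        "number_of_shards": 3,
        "number_of_replicas": 1,
        "refresh_interval": "30s",  # how often the index gets refreshed
    },
    "mappings": {
        "log": {  # a mapping type (these go away in Elastic stack 6.0/7.0)
            "_source": {"enabled": True},  # keep the original document
            "properties": {
                "message": {"type": "text"},  # tokenized for free-text search
                "host": {"type": "keyword"},  # taken as it is
                "price": {"type": "scaled_float", "scaling_factor": 100},
                "raw_payload": {"type": "text", "index": False},  # not indexed
            },
        }
    },
}
```

You would PUT a body like this to the cluster's `_template` endpoint before indexing any data.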

An index template cannot, on its own, enumerate all the fields present in an index.
The only way to get them would be to perform a GET on the index mapping, and even then you couldn't define how Kibana should treat each field.

Index patterns to the rescue!

An index pattern tells Kibana which fields you can query and what their types are.
Other advantages are:

  • it can target multiple indices (though you could also do that with aliases in Elasticsearch)
  • you can define scripted fields

It's important to trigger a refresh of the index pattern if the fields in the targeted index (or indices) have changed (fields added or removed, or a type change occurred).
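To make this more tangible, here is a simplified sketch of how an index pattern is stored in Kibana 5.x's .kibana index (as a document of type index-pattern); the exact shape is an internal detail, and the field values below are hypothetical. Note how the field list, including a scripted field, is kept as a JSON-encoded string:

```python
import json

# Simplified, hypothetical sketch of an index-pattern document as stored
# in the .kibana index (doc type "index-pattern") in Kibana 5.x.
index_pattern = {
    "title": "logs-*",              # indices this pattern targets
    "timeFieldName": "@timestamp",  # field used for the time filter
    # Kibana keeps the field list as a JSON-encoded string:
    "fields": json.dumps([
        {"name": "@timestamp", "type": "date", "searchable": True, "aggregatable": True},
        {"name": "host", "type": "string", "searchable": True, "aggregatable": True},
        # A scripted field lives alongside the real ones:
        {"name": "latency_ms", "type": "number", "scripted": True,
         "lang": "painless", "script": "doc['latency_s'].value * 1000"},
    ]),
}

fields = json.loads(index_pattern["fields"])
scripted = [f["name"] for f in fields if f.get("scripted")]
```

Refreshing the index pattern essentially re-derives that field list from the live index mapping.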

What happens if you want to export a single Dashboard?

You have several options:

  • the Elasticsearch API lets you back up all of Kibana's state, typically stored in the .kibana index
    • All dashboards, visualizations, index patterns and the dynamic configuration will be saved
  • Play with the new experimental Kibana import/export API (available since 5.5.0; it should be ready for prime time in 6.0)
  • Write a script in your preferred language

The last option implies that your tool:
  • gets the dashboard from the .kibana index (dashboard type, looked up by title)
  • gets the panelsJSON field
  • unmarshals the JSON data
  • gets all the visualization ids
  • exports all the visualizations, fetching them from the .kibana index (visualization type)
  • for each visualization:
    • gets the kibanaSavedObjectMeta field
    • unmarshals the JSON value
    • gets the index field within the query
  • gets the index pattern from the .kibana index (index-pattern type)
    • scripted fields are stored within the index pattern
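A minimal sketch of those steps in Python. The HTTP layer is omitted (assume each dict is the _source of a document fetched from .kibana), the inlined dashboard is a made-up example, and the document shapes follow Kibana 5.x conventions, which may change:

```python
import json

# Sketch of the manual export flow against a Kibana 5.x .kibana index.
# Fetching is omitted: assume each dict below is the _source of a document
# retrieved from .kibana (dashboard / visualization / index-pattern types).

def visualization_ids(dashboard_source):
    """Unmarshal panelsJSON and collect the referenced visualization ids."""
    panels = json.loads(dashboard_source["panelsJSON"])
    return [p["id"] for p in panels if p.get("type") == "visualization"]

def index_pattern_of(visualization_source):
    """Unmarshal kibanaSavedObjectMeta and return the index-pattern name."""
    meta = json.loads(
        visualization_source["kibanaSavedObjectMeta"]["searchSourceJSON"])
    return meta.get("index")

# Made-up dashboard document (shape abbreviated):
dashboard = {
    "title": "Web traffic",
    "panelsJSON": json.dumps([
        {"id": "status-codes-pie", "type": "visualization", "panelIndex": 1},
        {"id": "requests-over-time", "type": "visualization", "panelIndex": 2},
    ]),
}
vis_ids = visualization_ids(dashboard)
```

Each id in `vis_ids` would then be fetched from the visualization type, and its index pattern resolved with `index_pattern_of`.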

Unfortunately, there's no way to ensure this will not break in the near future.
Several changes will come with Elastic stack 6.0 and 7.0, such as:
  • Elasticsearch index mapping types will disappear
  • Kibana may get multi-tenancy
  • Internal storage changes in Kibana (these can occur even in minor versions), as the API is not public

Elastic wrote a post on Kibana internals in 2016 that's worth a look.

There's no silver bullet solution to move your dashboard out from your cluster in a future-proof manner.

Hope you find this post useful!

Bye!

Tuesday 26 September 2017

Password-theft 0day imperils users of High Sierra and earlier macOS versions


There's a vulnerability in High Sierra and earlier versions of macOS that allows rogue applications to steal plaintext passwords stored in the Mac keychain, a security researcher said Monday. That's the same day the widely anticipated update was released.

The Mac keychain is a digital vault of sorts that stores passwords and cryptographic keys. Apple engineers have designed it so that installed applications can't access its contents without the user entering a master password. A weakness in the keychain, however, allows rogue apps to steal every plaintext password it stores with no password required. Patrick Wardle, a former National Security Agency hacker who now works for security firm Synack, posted a video demonstration here.

The video shows a Mac virtual machine running High Sierra as it installs an app. Once the app is installed, the video shows an attacker on a remote server running the Netcat networking utility. When the attacker clicks "exfil keychain" button, the app surreptitiously exfiltrates all the passwords stored in the keychain and uploads them to the server. The theft requires no user interaction beyond the initial installation of the rogue app, and neither the app nor macOS provides any warning or seeks permission.

An Apple representative e-mailed the following statement:

macOS is designed to be secure by default, and Gatekeeper warns users against installing unsigned apps, like the one shown in this proof of concept, and prevents them from launching the app without explicit approval. We encourage users to download software only from trusted sources like the Mac App Store and to pay careful attention to security dialogs that macOS presents.

Continually disappointed

By default, Gatekeeper prevents Mac users from installing apps unless they're digitally signed by developers. While the app in the video is unsigned—and as a result can't be installed on a default Mac installation—the vulnerability can be exploited by signed apps as well. All that's required to digitally sign an app is a membership in the Apple Developer Program, which costs $99 per year. Wardle reported the vulnerability to Apple last month and decided to make the disclosure public when the company released High Sierra without fixing it first.

"As a passionate Mac user, I'm continually disappointed in the security of macOS," Wardle told Ars. "I don't mean that to be taken personally by anybody at Apple—but every time I look at macOS the wrong way something falls over. I felt that users should be aware of the risks that are out there."

Wardle said Apple would be served well by implementing a bug bounty program for macOS. Last year, the company established a bounty program that pays as much as $200,000 for security bugs in iOS that runs iPhones and iPads. Apple has declined to pay researchers for private reports of security flaws in macOS. Earlier this month, Wardle published details of a second unfixed bug in High Sierra.



Read the full article here by Ars Technica

MongoDB’s Mongo Moment

As it files for an IPO, open source database vendor MongoDB has rich opportunities, yet also faces its share of challenges

Read the full article here by Datamation.com

Monday 25 September 2017

ARM TrustZone Hacked By Abusing Power Management

"This is brilliant and terrifying in equal measure," writes the Morning Paper. Long-time Slashdot reader phantomfive writes: Many CPUs these days have DVFS (Dynamic Voltage and Frequency Scaling), which allows the CPU's clockspeed and voltage to vary dynamically depending on whether the CPU is idling or not. By turning the voltage up and down with one thread, researchers were able to flip bits in another thread. By flipping bits when the second thread was verifying the TrustZone key, the researchers were granted permission. If number 'A' is a product of two large prime numbers, you can flip a few bits in 'A' to get a number that is a product of many smaller numbers, and more easily factorable. "As the first work to show the security ramifications of energy management mechanisms," the researchers reported at Usenix, "we urge the community to re-examine these security-oblivious designs."

Read more of this story at Slashdot.



Read the full article here by Slashdot

Sunday 24 September 2017

Android's 'Check For Update' button now works and actually checks for OTA updates!

Updating Android can be a painful experience depending on the handset you have and the carrier you're with. You might hear that an update is available, but find nothing shows up when you hit the Check For Update button. Now this should be changing... and not just for Oreo users. Many Android users will have experienced the frustration of knowing full well that there is an update available for their device, but it's not offered up automatically. Even more annoyingly, it's often the case that even when performing a manual check, updates remain unavailable for download. Google has announced that:… [Continue Reading]


Read the full article here by Betanews

Saturday 23 September 2017

Facebook Relents, Switches React, Flow, Immutable.js and Jest To MIT License

An anonymous reader quotes the Register: Faced with growing dissatisfaction about licensing requirements for some of its open-source projects, Facebook said it will move React, Jest, Flow, and Immutable.js under the MIT license next week. "We're relicensing these projects because React is the foundation of a broad ecosystem of open source software for the web, and we don't want to hold back forward progress for nontechnical reasons," said Facebook engineering director Adam Wolff in a blog post on Friday. Wolff said while Facebook continues to believe its BSD + Patents license has benefits, "we acknowledge that we failed to decisively convince this community"... Wolff said the updated licensing scheme will arrive next week with the launch of React 16, a rewrite of the library designed for more efficient operation at scale. Facebook was facing strong criticism from the Apache Software Foundation and last week Wordpress.com had announced plans to move away from React. "Wolff said Facebook considered a license change for its other open-source projects, but wasn't ready to commit to anything," the Register adds. "Some projects, he said, will keep the BSD + Patents license."

Read more of this story at Slashdot.



Read the full article here by Slashdot

Thursday 21 September 2017

Lumberyard: Building a Better Engine

Amazon decided to give a preview for the months ahead and share some features to be added to Lumberyard.

The focus for the next few releases is to make Lumberyard easier, more powerful, and more modular. The team is working hard to deliver new systems and features that align with these goals, and your feedback has played a crucial role in that process. 

A lot has changed since the engine was first launched: they’ve replaced over 60% of the original codebase, switching out older, redundant systems (e.g. CryEntity) for more modern, performant ones (e.g. Component entity systems)—and this will continue to be the case. While some new systems are still in preview, the team is working to provide a stable foundation for users’ current and future games, so you can build and innovate confidently moving forward. You can also expect more detailed tutorials and documentation to support these systems in the months to come.

So what exactly are these new systems and features? Here’s a glimpse of what you can expect in the next couple of releases:

  • Script Canvas – Script Canvas, the new visual scripting language, will provide a high performance, flexible scripting language in a familiar node interface, so content creators can build gameplay with little to no programming experience. Built entirely from the ground up to be compatible with Lumberyard’s component entity and Behavior Context system, Script Canvas enables teams to easily use any combination of visual scripting, Lua, or C++ coding in their game project. Script Canvas will replace Flow Graph.
  • Brand new animation system – Siggraph attendees got a sneak peek at a new, robust animation solution, which was built from technology used by well-known publishers such as EA and Ubisoft. The goal here is simple: help animators build amazing characters in Lumberyard with little to no engineering help. This new system will replace the existing CryAnim functionality, including Geppetto and Mannequin, and include functionality such as a visual state machine, support for linear skinning, joint scaling, and much more.
  • CryEngine Legacy Deprecation – In addition to streamlining the editor experience, the team will soon hide the legacy functionality to better signal to customers which systems are forward-looking. This effort will help developers migrate from legacy entity systems to the new component entity system, and will include a data converter tool for developers still using some of the older CryEntity systems. A significant number of legacy system data will be auto-converted to the new component model in the next few releases—all in an effort to remove CryEntity systems from Lumberyard early next year. 
  • More Cloud Gems and AWS Integration – soon, you’ll see a Cloud Gem that helps capture in-game surveys from your players, as well as a gem that leverages the power of Amazon Lex and Amazon Polly to build speech recognition, text-to-speech, and conversational gameplay experiences. From there, the roadmap considers new gems that reduce or automate engineering effort to build common connected and social features (e.g. push notifications, metrics, monetization tools, dynamic content, etc.), accelerate production (e.g. asset processing), and enable new player experiences. 
  • Component Entity Workflows – they will continue to improve the component entity workflows, especially in the areas around usability and interoperability with the Entity Inspector, Entity Outliner, and viewport. These improvements also include better support for working on large scale levels with lots of content, improved entity organization and manipulation capabilities, and better slice manipulation and collaboration tools – working towards the eventual ability to open and edit a slice that is not part of a level. 
  • Location-independent Game Project – they plan on delivering Project Configurator changes and build system improvements that enable customers to store their game and gems in any arbitrary location. This has been a popular request from the community.
  • Mobile Performance and Optimization – they are also improving mobile workflows and performance. Their efforts will continue to improve frame rate on minimum spec devices (iPhone 5S+ and Android Nexus 5 and equivalents), improve battery life usage for applications, and reduce memory and app size footprint on iOS and Android (currently at ~25MB minimum size, but we’ll continue to work to make it smaller).
  • Memory Management and Thread Unification – they have two on-going initiatives underway to improve runtime performance (especially for PC and console platforms) as well as stability. First off, they will unify and optimize Lumberyard’s memory management. The teams are focused on identifying and resolving existing memory leaks while improving the memory usage patterns throughout the engine. Second, they also plan on migrating the engine’s various threading mechanisms to the AZCore job system, enabling further threading performance improvements and load balancing.
  • New Shader and Material System – the short term objectives are to improve the usability of the material and shader system by converting sub-materials into individual material assets, enabling the concept of a shared material library, and letting a developer author a new surface shader without compiling C++. Longer term, they’re planning a full refactor and modernization of the material and shader system, but they’re going to spend the time to get this right, and this work will go into next year.
  • macOS – the vision has always been to provide a complete, cross-platform experience, so they are also providing the tools necessary for developing Lumberyard games on macOS. This includes the Lumberyard editor and all its associated tools, the asset processor, the shader compiler and, of course, the renderer. Mac support has been a popular request from the customers, especially the ones building iOS games.

© a.sergeev for 80lvl, 2017.



Read the full article here by 80lvl

Google is buying HTC's Pixel team for $1.1 billion

After weeks (months, and years) of speculation, HTC has announced that its "Powered by HTC" R&D division -- the team behind Google's Pixel and Pixel XL smartphones -- will be purchased by Google for $1.1 billion in cash. According to HTC's CFO Peter Shen, this will mean about half -- yes, half -- of the 4,000 people in his company's R&D team will be joining Google, but he emphasized that HTC will continue developing its own range of smartphones, including its next flagship product. The agreement also grants Google a non-exclusive license for a large part of HTC's intellectual property. The deal is expected to be approved and closed by early 2018.

Curious about what all of this means? You could do worse than to check out our guide to the subject from last week.

"This agreement is a brilliant next step in our longstanding partnership, enabling Google to supercharge their hardware business while ensuring continued innovation within our HTC smartphone and Vive virtual reality businesses," HTC co-founder and CEO Cher Wang said in a statement.

The rumor mill went into overdrive yesterday after HTC announced that trading of its shares on the Taiwan Stock Exchange would be halted today pending a "major announcement." The company swiftly added, to debunk sale rumors, that it did not "comment on market rumor or speculation."

By then, however, almost everyone had assumed that the long-standing flirtation between the two companies would finally be consummated. Unsubstantiated reports on Twitter claimed that the deal would see HTC's manufacturing division become a part of the search giant, but the reality is that half of its R&D division will be joining Google instead. According to Google SVP of Hardware Rick Osterloh, the two parties have yet to set a new work location for these employees, but they will aim to bring minimal disruption to them. The remaining R&D team will focus on HTC's own smartphone brand as well as VR technology.

In return, Google "will continue to have access to HTC's IP to support the Pixel smartphone family," according to HTC's statement. Or in Osterloh's own words, it's "continuing our big bet on hardware," which is fitting given his involvement with Google's short-lived ownership of Motorola's smartphone business.

Much like the deal that cleaved Nokia's hardware business from its parent company, the HTC name and brand will live on, in both the smartphone and VR worlds.

Daniel Cooper contributed to this article.

Source: Google, HTC



Read the full article here by Engadget

From Small to Complex: Level Art in a Box

Alexander Sychov talked about the way he creates exquisite environment art with cardboard boxes in Unreal. This scene is actually available for download at Unreal Marketplace.

Introduction

Hello, everyone! My name is Alexander Sychov, and I’m a 3D Artist. At the moment, I’m wrapping up my contract as a Senior 3D Artist at Deep Silver Fishlabs in Hamburg, Germany and preparing to transition to a Level Artist position at Ubisoft, Toronto.

I have worked in the game industry since 2006, honing my craft at companies like Crytek Ukraine, Nikitova and Ulysses Games. Some games I’ve worked on are Warface, Crysis 2 & 3, Ryse and others.

I began teaching myself 3D art when I was around 15. As I remember it, my parents bought a used PC that had 3D Studio Max installed on it, and that was my first step towards my dream of making games. From there, I started making mini-games with my friends and animating short amateur video clips with my brothers (who are now professional animators). These passion projects were my art school.

How it started

It all started with an idea that I'd had for a long time. I've always been interested in and eager to learn Unreal Engine, and I knew that the best way to learn would be to dive right in. I decided to create a modular warehouse environment. I had a pretty smooth start thanks to the user-friendly UI, readable documentation, and the volume of super-helpful free tutorials on YouTube. Pre-production and learning the basics of the engine were simultaneous and ongoing throughout the project.

Workflow

My workflow is not anything revolutionary – I use a familiar, conventional approach to level art.

Everything goes from small to big, from primitive to complex. The beginning always starts on paper. In this case, I used Trello to structure and break down the level into stages (block out, rough low-poly, low-poly + UV, high-poly + baking, texturing, asset polishing, level beautification – the pipeline might vary in some cases) and smaller tasks, like gathering inspirational materials, videos, screenshots, etc. I spent plenty of time collecting references and putting them together into one library on Pinterest. PureRef is also perfect for storing all the essential refs on one huge sheet.

I always have PureRef open on top of all windows during blocking out, modeling, texturing, level dressing, whatever; always. It helps me pay attention to the details and scale, and to get the right immersion, especially when combined with relevant music.

Music is another topic worth mentioning. I am a huge Half-Life geek, and while I worked, I listened to the OST so that I could imagine the "universe" in which this warehouse might exist. That said, I listen to different music genres depending on my mood and asset complexity: if it is something technical, I switch from rapid music to calm, chill, ambient music to stay focused. If I'm doing fast, creative work, I change to Drum and Bass to speed up the workflow and keep me charged up.

Once I have a list of major architecture and prop assets, particularly for this kind of environment, I start modeling in 3DS Max. Since I planned to create a set that would be modular and flexible, there was no need to start from a base scene in the engine. It's also important that I always have reference objects, like a human-scale body, in my scene for quick, accurate visual measurements of the object I'm modeling. Sometimes I use boxes sized to 1, 2, or 3 meters just so I have a sense of scale.

Once major block out assets are complete, I check them live in the Engine and snap them together. I do this to verify modularity and to ensure that everything snaps properly. This creates little milestones that help me see the bigger picture over time because I can see how the pieces start fitting together. Additionally, I can establish a first lighting pass in order to make the proper mood in the level.

Cardboard boxes

I wanted to be as efficient and as fast as possible to save time, but without compromising visual fidelity and the overall look. It's a challenge, but I found that it's possible by working smarter! In some rare cases, it was still a bit tricky to achieve the desired result.

Instead of creating standard high-poly models with support edges from the low-poly model, I slightly modified the low-poly mesh (with some extra rounding on the corners where needed) and then put it through ZBrush to apply Dynamesh and polish it with the tools in the Deformation scroll.

This workflow worked out pretty well on most of the props (barrels, crates, racks, boxes and other stuff), because I spent very little time preparing high-poly meshes, and if I needed to I could always add some wear details in ZBrush. Yes, hard surfaces and no support edges on the high-poly models!

In the end, I had decent bevels on the normal map.

Creating a diffuse texture was a bit of a process. I admit that I was thinking of using simple cardboard textures from the internet, but after a few tries, I gave up on that idea. Who uses photo textures these days anyway? Not me, thanks. I decided to find some basic free Substance materials on the internet and then improve them to my particular specifications. I definitely made the right decision, since this gave me the ability to control dirtiness, alter the internal structure of the edges of the cardboard, and explore other procedural options. Procedural materials for everyone! Hell yeah!

The next problem to solve was how to get many variations of the boxes into the level. Unfortunately, I didn't create a blueprint to magically place all the stickers, notes, and shortcuts; I did it by hand during the level dressing pass. I prepared a few assembled prefabs of boxes in advance, and after making an extra beautification pass on the level, I tried to tell the warehouse's story by making all the visible areas look unique and true to my references. In a perfect world, I would have made a blueprint that automates the small details on the boxes. That would've been the better way to do it, but I didn't due to time constraints. Next time, I'll try to make some blueprints for sure!

Style

The main direction was real-life references of warehouses that I collected at the beginning of the project. One thing I really liked in the references was the super clean and tidy warehouse floor and surrounding environment, so my goal was to have it "like new", but with some slightly visible wear and tear.

I also really liked creating bright color combinations to make it pop, particularly yellow and blue, which I ended up using in the entry facility interior. What really helps to balance the picture is using opposing colors and brightness differentials to distinguish the floor, ceilings, and walls. This visual trick is very handy for scaling up the attractiveness of the level, and it gives the player a good overview of the level by preventing them from getting lost in a mass of warehouse details. Aside from the color scheme, I used light and properly arranged assets to enhance the aesthetics of the environment.

Substance Designer & Painter

At the same time I was learning UE4, I was also exploring Substance products like Designer, which I wanted to use to create several essential materials, like a polished concrete floor and different metals for the warehouse's walls and ceilings. (They can be downloaded here for free.)

I spent the most time creating the floor because it is one of the most visible parts of the warehouse theme. I had to get used to the new Substance Designer workflow, which also took up some time. I knew the power of Designer but couldn't use it to its full extent due to my lack of experience with it; still, I think that after chipping away at it, I was able to get the results I wanted. And again, it really helped to have references on top of the screen at all times, catching my eye and reminding me what I wanted the warehouse to look like.

I made most of the textures for the props and architecture assets in Substance Painter. Basically, for all crates, barrels, buckets, and other related props, I took a default plastic material from the library and created a custom Smart material where I stored a few levels of dirtiness and wear such that when I applied it to the asset, I could select which type of effect I wanted (low dust, heavy dust; plastic scratches, paint dots). As you may have noticed, some of the plastic objects are not completely new, but rather variations of small details, like wear.

One thing I decided to have on hand while texturing props in Substance Painter was a set of alpha masks with different text variations. The collection grew as the project went on. These were generic texts used to "simulate" the imaginary production and use of the assets; I wanted to bring as much realism and authenticity to the assets as possible.

Project Materials

Being relatively new to UE4, I'd say the master material setup I created in the material editor was not as complex and unique as I would have wished. However, there is one thing I implemented almost everywhere: changing the color by using grayscale masks and a correlating color mask (RGB, plus a black mask in the alpha channel) if the object had different material types. I am really happy with how it worked out! After getting the proper setup, I prepared a set of instance materials with different colors for each asset so that during level dressing I could change it to any color I wanted on the fly.

This is how the basic master material looks:

There are a few materials that differ slightly (props, architecture, decals, signage, etc.). I adjusted the roughness parameters to create glossier or rougher asset materials.

I found this very helpful in designing different areas of the level, especially making the floor surfaces “wet” and “clean”.

Lighting

Since there is no directional source of light, I decided to create the key lighting mostly with Spot Lights placed above every ceiling lamp. The focus was mainly on creating an "artificial" lighting look. Furthermore, I didn't hesitate to use simple point lights (static, without shadows) to fake illumination in dark corners where I noticed low visibility.

To create contrast, I employed various environment artist tricks. Specifically, for the closed passage area, I used "warm" lighting just to make it look different from the other "colder" locations. In addition, most of the surrounding props were changed to a similar color palette (yellow racks, for instance), along with weathered warehouse wall materials, to create significant visual patterning.

Another trick I used was to accurately highlight details throughout the levels, like signage, numbers and letterings by using the same spot lights.

I had to fiddle with the SkyLight actor, which had a minor but still visible impact on the scene. I added an Atmospheric Fog element to get a dusty sort of look, which helped to underscore the "emptiness" of the negative spaces between the numerous prop clusters.

Challenges

In this project, one of the challenges for me was to manage modularity and the interconnection between the assets. It was all I could do to keep myself from adding even more walls, rooms, and corridors.

Another challenge was to adjust the baked lighting in the level, which I feel I probably could have spent way more time perfecting.

Only now have I finally become familiar with how baked lighting works and how to manage baking time more efficiently. For instance, merging some modular parts of the level, like walls, into combined groups (the entire ceiling can be one mesh) reduces lightmap baking time significantly. I spent a bit of time on the Unreal forums getting answers about lightmap baking, and I highly recommend taking a look there if you have any questions yourself.

Even when I reached the end of the project, that is, with all the requirements satisfied, I was left with a feeling of "incompleteness". Like all artists, I knew that I could polish it endlessly, but I thought it better to be disciplined and save some creative (and physical) energy for the next environment.

The goal I set for myself was to learn how to apply my years of environment design to Unreal. With project planning, the right inspirations, and a healthy dollop of elbow grease, I succeeded in making the warehouse set modular, customizable, and usable in UE4. I am grateful and proud to be able to share the product, as well as the process by which it was created, with you. I hope that by reading this article, you might be inspired to capitalize on your potential too.

If you’re interested in seeing more of my work, you can check out my artstation.

If you want to use my assets in your upcoming products, or if you just want the best view of the asset in a custom scene, you can find them available in the marketplace.

As a final note, I'd like to give a quick thank you to the friends and colleagues who have helped me with their constructive feedback, inspiration, and encouragement, including my fellow 3D artists Denis Novikov, Denis Rutkovsky, and Cyrill Vitkovskiy, and game designer Phillip Chan.

Alexander Sychov, Senior 3D Artist at Deep Silver FISHLABS

Interview conducted by Kirill Tokarev






Read the full article here by 80lvl