Channel: StudioSysAdmins Message Board

Maya 2018 Central Install

posted by Mike Rochefort on March 26, 2018, 5:05 p.m. (2 days ago)
Hello!

As this is my first time doing this, I thought I'd reach out to all of you on best practices for installing Maya to be used on a farm. I'd rather not have to install Maya on each blade, but if it's better to, then so be it.

My first assumption is to just install Maya locally on one blade and then copy the /usr/autodesk directory over to my network share. As this will only be used for rendering, I'm not sure if I need the licensing RPMs included with the tar file. Or I could just rpm --prefix it to the share from the start.
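
For anyone answering, the relocation route I'm picturing looks roughly like this (a sketch only, assuming the Maya RPMs are relocatable; the share path, extraction directory, and package names are placeholders):

    # extract the Maya 2018 Linux tarball, then relocate the RPMs onto the share
    cd /tmp/Maya2018
    rpm -ivh --prefix=/mnt/software/autodesk Maya2018_64-2018.0-*.x86_64.rpm

    # on each blade, point at the shared install for batch rendering
    export MAYA_LOCATION=/mnt/software/autodesk/maya2018
    $MAYA_LOCATION/bin/Render -r file -s 1 -e 10 /jobs/show/scene.mb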

Just looking for the best methodology here, any tips are appreciated!

Cheers,
Mike

Thread Tags:
  discuss-at-studiosysadmins 



Metadata controller options?

posted by Perry Paolantonio on March 27, 2018, 10:29 a.m. (1 day ago)

We're a small studio (3 users, about a dozen workstations) that specializes in film scanning, color correction, and restoration. We're dealing with very big files - a combo of large image sequences and large containerized files (ProRes). We have a 40GbE network that has been up and running for a few years, and all machines are connected over SMB. Each workstation has a large local RAID, so we've been able to limp along by reading stuff over the network from one shared local RAID, doing something with the files, and writing to that workstation's local RAID. But this has obvious limitations, so over the past few months I built an iSCSI SAN. It has two 60TB hardware RAID6 pools, which we've divided up into 12-16TB iSCSI targets.

The hardware has been up and running for a while, but we're using it the way we were using local drives: a workstation will mount an iSCSI target and then use SMB to share it with others that may need occasional access. This kind of works, but what I want is to have multiple machines with read access on a target that one machine has read/write access on. SMB is a performance bottleneck, so I want to be able to connect directly over iSCSI.
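
For context, the current per-workstation workflow on the Linux boxes looks roughly like this (a sketch; the portal IP and IQN are placeholders):

    # discover and log in to one of the targets, then mount it on ONE machine only
    iscsiadm -m discovery -t sendtargets -p 10.10.10.5
    iscsiadm -m node -T iqn.2017-01.com.example:pool1.target3 -p 10.10.10.5 --login
    mount /dev/sdb1 /mnt/target3
    # ...that machine then re-shares /mnt/target3 over SMB for everyone else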

So I am looking for a metadata controller for iSCSI on a 40GbE network, software only, that will work with our existing hardware installation. So far I've found a grand total of two options:

TigerStore (used to be MetaSAN) - about $1000/server, but the price is tied to storage size. More space = more cost, so we're probably looking at $2k to start and $3k by the end of the year. This seems an arbitrary way to price, and it rubs me the wrong way. Also, I haven't heard back after trying to contact them a few weeks ago with questions, so that's not encouraging.

iSANmp (Storage Network Solutions) - not a metadata server, but a peer-to-peer system that requires the software to be installed on each workstation, at $200/seat. Probably in the $2500 range to set this up. Also, each machine uses a USB dongle, so adding new seats is a bit of a pain.

*All* I want is a software solution. I don't need a turnkey system. We can build the server or use one of our existing servers for this purpose. I don't care if the metadata controller runs on Windows or Linux, but we do need support on the workstation side for Mac, Windows 7, and Linux (CentOS 6 and 7).

Is an inexpensive software-only solution to this problem a pipe dream? I'm hoping to make a list of stuff to look at at NAB so any suggestions are welcome.

Thanks!

Thread Tags:
  storage 


Any other Winnipeg admins?

posted by James Paskaruk on April 2, 2018, 1:34 p.m. (2 days ago)

Hey All,

Just discovered this site. Any other Winnipeg folks on the site? I feel sad being unable to periodically drink beer with other members due to geography.

Thread Tags:
  fun 


Using Vray for Maya 2018 as a Module

posted by Mike Moss on April 9, 2018, 7:25 p.m. (2 days ago)

Hello,

I've contacted Chaos Group twice about this and they haven't given me a response yet, so I'm hoping someone here might have an idea.

We are running Maya 2018.2 in our studio with V-Ray 3.6.03 as a local install on each machine (Windows 7 Pro). I'd like to move away from that and have V-Ray load from the network.

I've gone through the steps described here: https://docs.chaosgroup.com/display/VRAY3MAYA/Installation+of+V-Ray+for+Maya+from+a+.zip+File

and that does in fact work just fine. But instead of a bunch of variables that I need to set in GPO, I'd rather utilize Maya module files as much as possible.

We currently have a single module file that loads 3 other plugins without issue: Redshift, SOuP, and Ornatrix. I have started to dig through and managed to get V-Ray to successfully load by appending to the same module file. Everything seems to work. V-Ray loads; icons, plugins, tools, creation, all is good...

...Except...

...when you go to render the frame - it crashes:

// Error: V-Ray : ERROR: Could not create plugin SettingsRTEngine ! //
// Error: V-Ray : ERROR: Could not create plugin SettingsRenderChannels ! //
// Error: V-Ray : ERROR: Could not create plugin SettingsOutput ! //
// Error: V-Ray : ERROR: Could not create plugin SettingsOptions ! //
// Error: V-Ray : ERROR: Could not create plugin SettingsImageSampler ! //
// Error: V-Ray : ERROR: Could not create plugin FilterLanczos ! //
// Error: V-Ray : There was a fatal error building the scene for V-Ray. //

I have more info that I can share (the env variables in use right now, how my module file is built, etc.), but I figured I'd keep it somewhat streamlined to start and see if I get any bites.
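
For reference, here's a sketch of what the V-Ray stanza of that shared module file might look like. The UNC paths are placeholders, and the variable names are taken from the Chaos Group zip-install page linked above (the docs also list tools, OSL, and license-file variables that would need the same treatment), so treat this as a starting point rather than gospel:

    + MAYAVERSION:2018 PLATFORM:win64 vray 3.60.03 \\server\pipeline\vray\maya_vray
    PATH +:= bin
    MAYA_PLUG_IN_PATH +:= plug-ins
    MAYA_SCRIPT_PATH +:= scripts
    VRAY_FOR_MAYA2018_MAIN_x64 = \\server\pipeline\vray\maya_vray
    VRAY_FOR_MAYA2018_PLUGINS_x64 = \\server\pipeline\vray\maya_vray\vrayplugins

As far as I can tell, the Settings* plugins in the error log live in the vrayplugins directory, so a VRAY_FOR_MAYA2018_PLUGINS_x64 that doesn't resolve is the first thing I'd suspect.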

Thanks!
Mike
Thread Tags:
  variable maya vray module 


LTO4 now needed as well (will ship world wide at good price)

posted by Jorg-Ulrich Mohnen on April 10, 2018, 1:42 p.m. (1 day ago)

Team

We are searching for as many new LTO4 tapes as possible, preferably from Sony.

SMS/text 310-951-7331 or email info@earthondrive.com. (We have hundreds upon hundreds of TBs and we're stuck in the middle of it for a geospatial warehouse.)

Thanks

Jorg

Thread Tags:
  LTO4 now needed as well 


Multihost teradici over WAN?

posted by Arturo Camacho on April 13, 2018, 2:25 p.m. (1 day ago)

We've been able to set up one remote zero client to connect to our router, with port forwarding set up so it works correctly to one of the machines in our LAN.

Now there's the need to set up a second machine like this. Is there an option to change the default port the Teradici card/zero client uses?

If there isn't: I've tried setting up a VPN link, and that works, but it's super laggy due to latency. Any recommended piece of hardware that might do a better job?

Thanks in advance.

Thread Tags:
  teradici, vpn, latency, wan 


Rolling TV stand

posted by Sean Macrae on April 17, 2018, 1:10 p.m. (4 days ago)

Need a rolling TV stand to borrow/rent.

Anyone have one in the GTA?

Shark out!


Single vs. Dual CPU Xeon Gold?

posted by Sam on April 18, 2018, 12:15 p.m. (3 days ago)
I'm looking at the new HP Z8 workstations, and was wondering what experience people are having comparing single- vs. dual-CPU machines.

Are you better off going with dual CPUs (more L3 cache, more bandwidth) vs. a single CPU (less handshaking with the other CPU)?

In particular, I'm focusing on this for running the usual Nuke and Maya (with V-Ray) jobs, so I wasn't planning to get a crazy number of cores. I'm possibly looking at either:

Dual Xeon Gold 6128 (3.4 GHz, 6 cores)
vs.
Single Xeon Gold 6136 (3.0 GHz, 12 cores)

The single-CPU option is very slightly slower (not sure it's noticeable), but quite a bit cheaper.

I'm deliberately going with a high clock speed, since we sadly live in a world where far too much is not multi-threaded, and 12 cores seems like a pretty good balance. Or am I missing something?

Sam.



Thread Tags:
  discuss-at-studiosysadmins 


Midtown Toronto screening of "Ready Player One" Apr 24th 2pm

posted by Tom Burns on April 18, 2018, 1:08 p.m. (3 days ago)

Dell EMC Isilon and Dell Monitors would like to invite you to a screening of Ready Player One at the Yonge & Eglinton Cineplex at 2 p.m. on Tuesday the 24th.

Please register here: https://dell.captix.com/event/ReadyPlayerOne/register

Enjoy the show, and the swag! 

@TVBurns

Thread Tags:
  fun 


Disable Maya.env Files??

posted by Mike Moss on April 18, 2018, 1:27 p.m. (3 days ago)

Hi,

I just sent an email to Autodesk asking the same question, but thought I might get a faster response here.

Due to our internal pipeline, the maya.env files located at "C:\Users\<USER>\Documents\maya\2018\maya.env" cause problems for us. Usually if they are empty or do not exist then it's fine, but obviously, if the file is deleted, Maya re-creates it on the next start-up. It's sporadic, and at times users decide to enter their own data in there (things that should not be) and it wreaks havoc on our pipeline. We do not use this file AT ALL, so I was curious if there is a way to force Maya to ignore that file. We use other Windows env variables to enable/disable other features and settings. Does Maya have a Windows env variable that I can push out through GPO to disable the maya.env file altogether?

For example, right now via GPO I have this variable pushed out to all workstations:

MAYA_ENABLE_LEGACY_VIEWPORT = 1

Is there something I can do similarly to disable maya.env files? Something like:

MAYA_DISABLE_ENV = 1
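
If nothing like that exists, a fallback I'm considering (assuming MAYA_APP_DIR behaves as documented, i.e. it relocates the whole user prefs tree, including where maya.env is looked up) is to push that variable out through the same GPO so the maya.env Maya reads is one we manage. The UNC path here is just a placeholder:

MAYA_APP_DIR = \\server\pipeline\maya_prefs\%USERNAME%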

Thanks,

Mike


Thread Tags:
  maya 


windows 2012 NFS export question

posted by Greg Whynott on April 19, 2018, 1:50 p.m. (2 days ago)
I have a Windows 2012 server running Veeam and wish to export its local array over NFS. I've installed NFS services and can export and access C: from Linux machines.

But when I try to export the drive I wish to use, it doesn't show up as an option. Anyone know why this might be?

I've turned normal sharing on for it, renamed it, changed its drive letter, and rebooted in between each iteration. None of those helped.
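
For anyone reproducing this, the export list as the Linux side sees it can be double-checked with showmount (the hostname is a placeholder):

    # list the exports the Windows NFS server is actually publishing
    showmount -e veeam-server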


thanks if you know of a solution.
g$

Thread Tags:
  discuss-at-studiosysadmins 


High Quality Sample Footage - Uncompressed

posted by Devin Termini on April 21, 2018, 7:36 p.m. (1 day ago)

Hello,

I am working on a project to do some encoder quality benchmarking. To get the best results I'd like to start with the highest quality footage possible. Does anyone know where I can get uncompressed UHD (3840x2160) video? The uncompressed footage I can get is actually derived from some other compressed source such as ProRes, DNxHR, etc. For the most accurate test results, I'd like to avoid this.

Ideally this would be 10-bit, 4:2:2 color, Rec. 709 color space. The clips don't have to be long; perhaps a handful of 10-second clips would do the job.

I'm also considering using other footage, such as JPEG2000, Sony XAVC, or RED R3D as sources. These are more readily available. However, I'm uncertain if the compression in the source material will impact test results. Any input is welcome!

Thanks in advance!

Thread Tags:
  social 


Playback and Review

posted by Michael Oliver on April 24, 2018, 3:05 p.m. (1 day ago)
Reviving an old topic: curious what people are currently using for 4K playback/review systems, on both the hardware and software fronts.

Seems like lots of EVOs in an Icy Dock; RV / Resolve / Nuke Studio / Scratch / Baselight; Blackmagic / AJA; and 10GbE has been the standard for a while now.

Anyone using ZFS on playback machines to take advantage of tiering, or are you just using RAID cards with a bunch of SSDs?

Seems like this space has been stagnant with regard to innovation. I am hoping for something new to play with.

Years ago Daryl Strauss had the Framethrower, which provided a pretty good boot-to-review interface. I suppose I could script something to launch RV in a "console mode" on boot (rough sketch after the list below), but wanted to see if there were any other solutions out there that:

- Provide a controlled environment for review / playback of 4K HDR material, with minimal ability for users to change settings and muck things up
- HDR, 4:4:4 color-accurate playback
- Allow material to be remotely primed to fast local storage for instant review
- Bonus for allowing direct feedback during the review back into an asset management system (I know RV has this)
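
The boot-to-review sketch I have in mind (the mount point and the auto-login plumbing are placeholders to taste):

    #!/bin/sh
    # kiosk-style review session: run RV fullscreen on the review volume,
    # and relaunch it if anyone manages to quit
    while true; do
        rv -fullscreen /mnt/review/latest
    done

Run it as the X session of an auto-logged-in review user so there's no desktop for anyone to muck with.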

Anything new / turnkey / OSS out there that has not hit my radar?

--
Michael Oliver
mcoliver@gmail.com

Thread Tags:
  discuss-at-studiosysadmins 


Legacy Sun parts (Sun Fire V440)

posted by Bill Heiden on April 25, 2018, 5:19 p.m. (3 days ago)

Wondering if anyone knows of a company that sells legacy Sun parts? It appears that I need an LSI1030 SCSI controller. Would also welcome a reference to a company that specializes in older Sun equipment; Los Angeles-based is a plus.

Thread Tags:
  sun fire v440 oracle 


Mac Os / Chrome - open browse dialog for upload in a locked path

posted by Tamas Pataki on April 26, 2018, 8:26 a.m. (2 days ago)

Hi,

On a Mac, using the Chrome browser, when I'm on a page where I need to upload a file (say, attaching one to an email), the file dialog always opens the folder I used the last time.

Is it possible to change this behaviour so the dialog always starts from the desktop or some pre-set location? I want to force the user to find the path manually every time, even if it is the same location, to ensure it is not mistaken for other folders that have the same name.

I know downloads behave differently, and that I can change the default download path in the application.

Do you know if this can be done? I'm not even sure whether it's OS- or application-specific.

Thanks

Tamas




Mac Os / Chrome - open browse dialog for upload in a locked path

posted by Greg Whynott on April 27, 2018, 6:55 p.m. (1 day ago)
Not sure without looking at it, but there may be a 'delete browsing history on close' option; that might be a short-term fix.

Or look at the preference file; there may be a 'remember last' thingy... should be \users\g.whynott\appdata\local\google\chrome\user data\default or similar.

I'll take a look when I'm near a Mac.

greg


On Thu, Apr 26, 2018 at 8:26 AM, Tamas Pataki <content@studiosysadmins.com> wrote:

Hi,

On a Mac, using the Chrome browser, when I'm on a page where I need to upload a file (say, attaching one to an email), the file dialog always opens the folder I used the last time.

Is it possible to change this behaviour so the dialog always starts from the desktop or some pre-set location? I want to force the user to find the path manually every time, even if it is the same location, to ensure it is not mistaken for other folders that have the same name.

I know downloads behave differently, and that I can change the default download path in the application.

Do you know if this can be done? I'm not even sure whether it's OS- or application-specific.

Thanks

Tamas



Thread Tags:
  discuss-at-studiosysadmins 


License Metric Names in Graphite

posted by Jesse Kretschmer on April 30, 2018, 11:15 a.m.
Hello all,
Does anyone have recommendations on license metric names in Graphite?

I'm planning to do something like:
licenses.<server>.<vendor>.<feature>.issued
licenses.<server>.<vendor>.<feature>.inuse
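
For concreteness, a sample in that scheme would go to Carbon's plaintext listener (port 2003) like this; the server name and feature are placeholders:

    TS=$(date +%s)
    echo "licenses.lic01.autodesk.maya.issued 50 $TS" | nc -q0 graphite.example.com 2003
    echo "licenses.lic01.autodesk.maya.inuse 42 $TS" | nc -q0 graphite.example.com 2003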

The setup would work fine for basic information, but I've been using Prometheus and OpenTSDB for a while and I have gotten used to tags. I was also tagging user and host in Prometheus, which gave some really great usage reports. Vendor and feature were also treated as tags, so aggregating and grouping was pretty simple.

We have a stable Graphite setup, which is better than my single-Docker-instance Prometheus node. I would rather participate in our current system than continue running my own.
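
(One thing I'm eyeing: if our Graphite ends up on 1.1 or newer, I believe the same plaintext listener also accepts tagged series, e.g. license.usage;vendor=autodesk;feature=maya;user=jdoe 42 <ts>, which would get back some of that Prometheus-style grouping.)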

Cheers,
Jesse




Thread Tags:
  discuss-at-studiosysadmins 

files copied via nfs mount are unreadable on windows

posted by Kym Watts on May 1, 2018, 12:40 p.m.

Hello,

Forgive me if I'm re-posting this; I think it was discussed a few years ago, but I can't for the life of me find it.

We are getting a problem where files copied via an NFS mount point on Linux are unreadable on Windows by the same user.

In a nutshell the problem looks like this:

When our render user on Linux creates a file directly on the NFS mount, it is readable by the Windows render user.

When our render user on Linux creates a file directly on the NFS mount, then copies/renames the file on the NFS mount, the Windows user cannot read the file any more.

Simplest Linux test to recreate the problem:

Create a file with the command 'touch test.txt'
    - this file will have the correct permissions on it.

Copy the file with the command 'cp test.txt test_copy.txt'
    - the copy has BAD permissions on it.
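
For anyone who wants to poke at this, here's the comparison we keep staring at, as a diagnostic sketch (run on the NFS client; the getfacl lines only apply if the export carries POSIX ACLs):

    touch test.txt
    cp test.txt test_copy.txt
    ls -l test.txt test_copy.txt      # compare the two modes
    umask                             # both touch and cp mask a new file's mode with this
    getfacl test.txt test_copy.txt    # look for a default ACL inherited from the directory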

Can someone point me in the right direction, please?

Cheers

Kym

Thread Tags:
  nfs permissions 


files copied via nfs mount are unreadable on windows

posted by Todd Smith on May 1, 2018, 1:10 p.m.
Hi Kym,

Can you show us what the permissions are?

Thanks,
Todd Smith
Head of Information Technology

soho vfx 
40 Hanna Ave. Suite 403, Toronto, Ontario M6K 0C3

----- On May 1, 2018, at 12:40 PM, Kym Watts <content@studiosysadmins.com> wrote:

Hello,

Forgive me if I'm re-posting this; I think it was discussed a few years ago, but I can't for the life of me find it.

We are getting a problem where files copied via an NFS mount point on Linux are unreadable on Windows by the same user.

In a nutshell the problem looks like this:

When our render user on Linux creates a file directly on the NFS mount, it is readable by the Windows render user.

When our render user on Linux creates a file directly on the NFS mount, then copies/renames the file on the NFS mount, the Windows user cannot read the file any more.

Simplest Linux test to recreate the problem:

Create a file with the command 'touch test.txt'
    - this file will have the correct permissions on it.

Copy the file with the command 'cp test.txt test_copy.txt'
    - the copy has BAD permissions on it.

Can someone point me in the right direction, please?

Cheers

Kym


Thread Tags:
  discuss-at-studiosysadmins 


files copied via nfs mount are unreadable on windows

posted by Jean-Francois Panisset on May 1, 2018, 2:05 p.m.
What's the NAS you are using? Is it just a Linux box serving out SMB via Samba, meaning that the underlying filesystem being served via NFS and SMB is a POSIX filesystem, or is it some kind of enterprise NAS system? Also, what is the source of account information: Active Directory? LDAP?

A recipe that should kind of work with a Linux / Samba NAS (smb.conf sketch after the list):

- Active Directory as the source of account identity
- RFC 2307 fields populated in Active Directory to provide consistent POSIX UIDs and GIDs for user and group objects
- Bind all Linux machines to AD using SSSD
- Configure Samba to map between Windows SIDs and UNIX UID/GIDs using AD / RFC 2307
- Simple NFS v3 access (never had much luck using NFS v3 ACLs to "match" the Windows permission model)
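
As a sketch of the Samba side of that recipe (the EXAMPLE domain names and the idmap ranges are placeholders to adapt):

    [global]
        security = ads
        realm = EXAMPLE.COM
        workgroup = EXAMPLE
        # fallback backend for built-in/well-known SIDs
        idmap config * : backend = tdb
        idmap config * : range = 1000000-1999999
        # pull UIDs/GIDs from the RFC 2307 fields in AD
        idmap config EXAMPLE : backend = ad
        idmap config EXAMPLE : schema_mode = rfc2307
        idmap config EXAMPLE : range = 1000-999999

That way Samba and SSSD resolve the same UID for a given user, so a file written over NFS v3 shows up correctly owned when the same user comes in over SMB.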

JF



On Tue, May 1, 2018 at 9:40 AM, Kym Watts <content@studiosysadmins.com> wrote:

Hello,

Forgive me if I'm re-posting this; I think it was discussed a few years ago, but I can't for the life of me find it.

We are getting a problem where files copied via an NFS mount point on Linux are unreadable on Windows by the same user.

In a nutshell the problem looks like this:

When our render user on Linux creates a file directly on the NFS mount, it is readable by the Windows render user.

When our render user on Linux creates a file directly on the NFS mount, then copies/renames the file on the NFS mount, the Windows user cannot read the file any more.

Simplest Linux test to recreate the problem:

Create a file with the command 'touch test.txt'
    - this file will have the correct permissions on it.

Copy the file with the command 'cp test.txt test_copy.txt'
    - the copy has BAD permissions on it.

Can someone point me in the right direction, please?

Cheers

Kym



Thread Tags:
  discuss-at-studiosysadmins 

