Apr 16 04:24:03.885127 ip-10-0-133-81 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 04:24:04.362744 ip-10-0-133-81 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 04:24:04.362744 ip-10-0-133-81 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 04:24:04.362744 ip-10-0-133-81 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 04:24:04.362744 ip-10-0-133-81 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 04:24:04.362744 ip-10-0-133-81 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
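The deprecation notices above say these flags should move into the file passed via --config. A minimal KubeletConfiguration sketch of that migration — the field values below are illustrative assumptions, not settings taken from this node:

```yaml
# Hypothetical fragment of the kubelet config file (kubelet.config.k8s.io/v1beta1).
# Values are placeholders, not this node's actual configuration.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock  # replaces --container-runtime-endpoint
systemReserved:                                           # replaces --system-reserved
  cpu: 500m
  memory: 1Gi
evictionHard:                                             # --minimum-container-ttl-duration is superseded by eviction settings
  memory.available: 100Mi
```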
Apr 16 04:24:04.364513 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.364410    2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 04:24:04.367004 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.366987    2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 04:24:04.367004 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367004    2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 04:24:04.367071 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367008    2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 04:24:04.367071 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367012    2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 04:24:04.367071 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367015    2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 04:24:04.367071 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367018    2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 04:24:04.367071 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367021    2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 04:24:04.367071 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367024    2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 04:24:04.367071 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367026    2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 04:24:04.367071 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367029    2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 04:24:04.367071 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367032    2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 04:24:04.367071 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367035    2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 04:24:04.367071 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367038    2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 04:24:04.367071 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367041    2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 04:24:04.367071 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367043    2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 04:24:04.367071 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367046    2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 04:24:04.367071 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367049    2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 04:24:04.367071 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367051    2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 04:24:04.367071 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367054    2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 04:24:04.367071 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367057    2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 04:24:04.367071 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367059    2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 04:24:04.367071 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367062    2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 04:24:04.367590 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367064    2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 04:24:04.367590 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367067    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 04:24:04.367590 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367070    2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 04:24:04.367590 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367076    2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 04:24:04.367590 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367080    2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 04:24:04.367590 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367083    2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 04:24:04.367590 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367086    2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 04:24:04.367590 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367088    2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 04:24:04.367590 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367091    2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 04:24:04.367590 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367094    2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 04:24:04.367590 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367096    2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 04:24:04.367590 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367099    2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 04:24:04.367590 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367102    2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 04:24:04.367590 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367105    2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 04:24:04.367590 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367107    2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 04:24:04.367590 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367110    2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 04:24:04.367590 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367113    2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 04:24:04.367590 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367116    2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 04:24:04.367590 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367118    2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 04:24:04.368070 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367121    2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 04:24:04.368070 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367123    2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 04:24:04.368070 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367126    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 04:24:04.368070 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367129    2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 04:24:04.368070 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367131    2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 04:24:04.368070 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367134    2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 04:24:04.368070 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367143    2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 04:24:04.368070 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367145    2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 04:24:04.368070 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367148    2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 04:24:04.368070 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367152    2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 04:24:04.368070 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367156    2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 04:24:04.368070 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367159    2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 04:24:04.368070 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367161    2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 04:24:04.368070 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367164    2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 04:24:04.368070 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367167    2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 04:24:04.368070 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367170    2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 04:24:04.368070 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367172    2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 04:24:04.368070 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367175    2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 04:24:04.368070 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367178    2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 04:24:04.368070 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367180    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 04:24:04.368557 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367183    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 04:24:04.368557 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367187    2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 04:24:04.368557 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367191    2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 04:24:04.368557 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367194    2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 04:24:04.368557 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367197    2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 04:24:04.368557 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367200    2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 04:24:04.368557 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367203    2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 04:24:04.368557 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367205    2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 04:24:04.368557 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367208    2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 04:24:04.368557 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367211    2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 04:24:04.368557 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367215    2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 04:24:04.368557 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367218    2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 04:24:04.368557 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367221    2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 04:24:04.368557 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367223    2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 04:24:04.368557 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367232    2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 04:24:04.368557 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367235    2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 04:24:04.368557 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367237    2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 04:24:04.368557 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367240    2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 04:24:04.368557 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367242    2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 04:24:04.368557 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367245    2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 04:24:04.369064 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367248    2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 04:24:04.369064 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367251    2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 04:24:04.369064 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367253    2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 04:24:04.369064 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367256    2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 04:24:04.369064 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367258    2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 04:24:04.369064 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367725    2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 04:24:04.369064 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367732    2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 04:24:04.369064 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367735    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 04:24:04.369064 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367737    2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 04:24:04.369064 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367740    2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 04:24:04.369064 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367743    2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 04:24:04.369064 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367745    2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 04:24:04.369064 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367748    2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 04:24:04.369064 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367751    2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 04:24:04.369064 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367753    2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 04:24:04.369064 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367756    2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 04:24:04.369064 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367759    2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 04:24:04.369064 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367761    2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 04:24:04.369064 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367764    2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 04:24:04.369064 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367766    2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 04:24:04.369552 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367769    2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 04:24:04.369552 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367771    2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 04:24:04.369552 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367774    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 04:24:04.369552 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367777    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 04:24:04.369552 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367779    2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 04:24:04.369552 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367781    2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 04:24:04.369552 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367791    2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
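The "unrecognized feature gate" warnings repeat because the kubelet evaluates the gate map more than once at startup. When triaging a node like this one, it can help to collapse the stream into a de-duplicated per-gate count — a sketch using a small sample excerpt as a stand-in for real `journalctl -u kubelet` output:

```shell
# Collapse repeated "unrecognized feature gate" warnings into per-gate counts.
# The heredoc below stands in for `journalctl -u kubelet` output on a real node.
cat > /tmp/kubelet-sample.log <<'EOF'
W0416 04:24:04.367004 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
W0416 04:24:04.367766 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
W0416 04:24:04.367008 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
EOF

# sed keeps only the gate name; sort | uniq -c counts duplicates; sort -rn ranks them.
sed -n 's/.*unrecognized feature gate: //p' /tmp/kubelet-sample.log | sort | uniq -c | sort -rn
```

On a live node, pipe `journalctl -u kubelet` into the same filter instead of the sample file.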
Apr 16 04:24:04.369552 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367795    2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 04:24:04.369552 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367797    2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 04:24:04.369552 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367800    2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 04:24:04.369552 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367802    2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 04:24:04.369552 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367805    2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 04:24:04.369552 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367808    2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 04:24:04.369552 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367811    2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 04:24:04.369552 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367813    2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 04:24:04.369552 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367816    2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 04:24:04.369552 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367818    2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 04:24:04.369552 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367821    2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 04:24:04.369552 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367839    2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 04:24:04.369552 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367842    2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 04:24:04.370114 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367844    2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 04:24:04.370114 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367847    2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 04:24:04.370114 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367850    2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 04:24:04.370114 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367853    2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 04:24:04.370114 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367855    2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 04:24:04.370114 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367858    2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 04:24:04.370114 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367861    2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 04:24:04.370114 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367864    2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 04:24:04.370114 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367867    2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 04:24:04.370114 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367870    2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 04:24:04.370114 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367872    2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 04:24:04.370114 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367875    2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 04:24:04.370114 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367877    2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 04:24:04.370114 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367880    2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 04:24:04.370114 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367883    2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 04:24:04.370114 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367885    2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 04:24:04.370114 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367888    2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 04:24:04.370114 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367890    2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 04:24:04.370114 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367893    2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 04:24:04.370114 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367901    2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 04:24:04.370614 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367904    2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 04:24:04.370614 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367907    2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 04:24:04.370614 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367909    2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 04:24:04.370614 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367912    2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 04:24:04.370614 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367914    2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 04:24:04.370614 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367917    2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 04:24:04.370614 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367919    2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 04:24:04.370614 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367921    2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 04:24:04.370614 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367924    2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 04:24:04.370614 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367927    2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 04:24:04.370614 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367930    2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 04:24:04.370614 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367933    2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 04:24:04.370614 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367936    2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 04:24:04.370614 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367938    2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 04:24:04.370614 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367941    2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 04:24:04.370614 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367943    2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 04:24:04.370614 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367946    2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 04:24:04.370614 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367948    2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 04:24:04.370614 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367951    2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 04:24:04.371099 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367954    2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 04:24:04.371099 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367956    2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 04:24:04.371099 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367959    2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 04:24:04.371099 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367967    2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 04:24:04.371099 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367971    2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 04:24:04.371099 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367974    2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 04:24:04.371099 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367977    2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 04:24:04.371099 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367980    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 04:24:04.371099 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367982    2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 04:24:04.371099 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367985    2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 04:24:04.371099 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367987    2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 04:24:04.371099 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.367990    2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 04:24:04.371099 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368074    2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 04:24:04.371099 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368092    2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 04:24:04.371099 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368101    2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 04:24:04.371099 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368106    2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 04:24:04.371099 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368111    2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 04:24:04.371099 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368114    2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 04:24:04.371099 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368118    2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 04:24:04.371099 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368123    2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 04:24:04.371099 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368127    2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368130    2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368134    2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368138    2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368141    2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368144    2567 flags.go:64] FLAG: --cgroup-root=""
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368147    2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368151    2567 flags.go:64] FLAG: --client-ca-file=""
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368154    2567 flags.go:64] FLAG: --cloud-config=""
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368157    2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368161    2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368168    2567 flags.go:64] FLAG: --cluster-domain=""
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368171    2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368174    2567 flags.go:64] FLAG: --config-dir=""
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368177    2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368181    2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368185    2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368188    2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368191    2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368194    2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368197    2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368200    2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368204    2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368207    2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368209    2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 04:24:04.371607 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368214    2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368225    2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368228    2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368231    2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368234    2567 flags.go:64] FLAG: --enable-server="true"
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368237    2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368247    2567 flags.go:64] FLAG: --event-burst="100"
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368250    2567 flags.go:64] FLAG: --event-qps="50"
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368253    2567 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368256    2567 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368259    2567 flags.go:64] FLAG: --eviction-hard=""
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368263    2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368266    2567 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368270    2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368273    2567 flags.go:64] FLAG: --eviction-soft=""
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368276    2567 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368279    2567 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368282    2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368285    2567 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368287    2567 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368290    2567 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368293    2567 flags.go:64] FLAG: --feature-gates=""
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368297    2567 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368300    2567 flags.go:64] FLAG:
--global-housekeeping-interval="1m0s" Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368303 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 04:24:04.372230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368307 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368310 2567 flags.go:64] FLAG: --healthz-port="10248" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368313 2567 flags.go:64] FLAG: --help="false" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368316 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-133-81.ec2.internal" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368319 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368322 2567 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368325 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368328 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368332 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368340 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368344 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368346 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 04:24:04.372856 ip-10-0-133-81 
kubenswrapper[2567]: I0416 04:24:04.368349 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368357 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368361 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368364 2567 flags.go:64] FLAG: --kube-reserved="" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368367 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368370 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368373 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368376 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368379 2567 flags.go:64] FLAG: --lock-file="" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368382 2567 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368385 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368388 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 04:24:04.372856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368394 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368397 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368400 2567 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 
04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368403 2567 flags.go:64] FLAG: --logging-format="text" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368406 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368409 2567 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368412 2567 flags.go:64] FLAG: --manifest-url="" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368415 2567 flags.go:64] FLAG: --manifest-url-header="" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368419 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368422 2567 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368426 2567 flags.go:64] FLAG: --max-pods="110" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368429 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368432 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368435 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368438 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368441 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368444 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368447 2567 
flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368461 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368464 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368467 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368472 2567 flags.go:64] FLAG: --pod-cidr="" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368475 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 04:24:04.373452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368480 2567 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368483 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368486 2567 flags.go:64] FLAG: --pods-per-core="0" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368489 2567 flags.go:64] FLAG: --port="10250" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368492 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368495 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-05fe44204fb2d5b4b" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368499 2567 flags.go:64] FLAG: --qos-reserved="" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368502 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 
04:24:04.368505 2567 flags.go:64] FLAG: --register-node="true" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368508 2567 flags.go:64] FLAG: --register-schedulable="true" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368511 2567 flags.go:64] FLAG: --register-with-taints="" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368514 2567 flags.go:64] FLAG: --registry-burst="10" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368517 2567 flags.go:64] FLAG: --registry-qps="5" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368520 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368523 2567 flags.go:64] FLAG: --reserved-memory="" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368526 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368529 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368532 2567 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368535 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368538 2567 flags.go:64] FLAG: --runonce="false" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368541 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368543 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368546 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 
04:24:04.368549 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368552 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368555 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 04:24:04.374047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368558 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368562 2567 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368571 2567 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368575 2567 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368579 2567 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368582 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368585 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368588 2567 flags.go:64] FLAG: --system-cgroups="" Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368591 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368596 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368599 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368602 2567 flags.go:64] FLAG: --tls-cipher-suites="[]" 
Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368608 2567 flags.go:64] FLAG: --tls-min-version="" Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368611 2567 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368614 2567 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368617 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368620 2567 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368623 2567 flags.go:64] FLAG: --v="2" Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368627 2567 flags.go:64] FLAG: --version="false" Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368631 2567 flags.go:64] FLAG: --vmodule="" Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368635 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.368638 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368753 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368756 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368760 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 04:24:04.374653 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368763 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 04:24:04.375332 ip-10-0-133-81 
kubenswrapper[2567]: W0416 04:24:04.368765 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 04:24:04.375332 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368768 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 04:24:04.375332 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368771 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 04:24:04.375332 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368774 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 04:24:04.375332 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368777 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 04:24:04.375332 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368780 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 04:24:04.375332 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368782 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 04:24:04.375332 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368785 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 04:24:04.375332 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368788 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 04:24:04.375332 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368791 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 04:24:04.375332 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368795 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 04:24:04.375332 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368798 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 04:24:04.375332 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368801 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 04:24:04.375332 ip-10-0-133-81 
kubenswrapper[2567]: W0416 04:24:04.368804 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 04:24:04.375332 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368806 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 04:24:04.375332 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368811 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 04:24:04.375332 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368814 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 04:24:04.375332 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368816 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 04:24:04.375332 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368819 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 04:24:04.375332 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368822 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 04:24:04.375952 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368837 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 04:24:04.375952 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368841 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 04:24:04.375952 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368843 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 04:24:04.375952 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368846 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 04:24:04.375952 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368849 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 04:24:04.375952 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368851 2567 feature_gate.go:328] 
unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 04:24:04.375952 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368854 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 04:24:04.375952 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368857 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 04:24:04.375952 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368859 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 04:24:04.375952 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368862 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 04:24:04.375952 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368864 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 04:24:04.375952 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368867 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 04:24:04.375952 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368870 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 04:24:04.375952 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368872 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 04:24:04.375952 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368875 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 04:24:04.375952 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368879 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 04:24:04.375952 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368883 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 04:24:04.375952 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368886 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 04:24:04.375952 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368889 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 04:24:04.376429 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368891 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 04:24:04.376429 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368894 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 04:24:04.376429 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368897 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 04:24:04.376429 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368902 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 04:24:04.376429 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368905 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 04:24:04.376429 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368909 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 04:24:04.376429 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368912 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 04:24:04.376429 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368915 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 04:24:04.376429 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368919 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 04:24:04.376429 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368921 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 04:24:04.376429 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368924 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 04:24:04.376429 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368926 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 04:24:04.376429 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368929 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 04:24:04.376429 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368932 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 04:24:04.376429 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368934 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 04:24:04.376429 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368937 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 04:24:04.376429 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368939 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 04:24:04.376429 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368942 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 04:24:04.376429 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368944 2567 
feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 04:24:04.376912 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368947 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 04:24:04.376912 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368950 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 04:24:04.376912 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368952 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 04:24:04.376912 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368955 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 04:24:04.376912 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368957 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 04:24:04.376912 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368960 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 04:24:04.376912 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368963 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 04:24:04.376912 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368966 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 16 04:24:04.376912 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368968 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 04:24:04.376912 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368971 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 04:24:04.376912 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368973 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 04:24:04.376912 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368976 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 04:24:04.376912 ip-10-0-133-81 
kubenswrapper[2567]: W0416 04:24:04.368978 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 04:24:04.376912 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368981 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 04:24:04.376912 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368983 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 04:24:04.376912 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368986 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 04:24:04.376912 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368991 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 04:24:04.376912 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368994 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 04:24:04.376912 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.368997 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 04:24:04.376912 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.369000 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 04:24:04.377421 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.369002 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 04:24:04.377421 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.369006 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 04:24:04.377421 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.369009 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 04:24:04.377421 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.369011 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 04:24:04.377421 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.369812 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 04:24:04.377421 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.377051 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 04:24:04.377421 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.377069 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 04:24:04.377421 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377139 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 04:24:04.377421 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377155 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 04:24:04.377421 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377159 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 04:24:04.377421 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377162 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 04:24:04.377421 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377165 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 04:24:04.377421 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377168 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 04:24:04.377421 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377172 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 04:24:04.377421 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377176 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 04:24:04.377421 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377180 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 04:24:04.377889 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377184 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 04:24:04.377889 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377187 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 04:24:04.377889 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377190 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 04:24:04.377889 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377193 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 04:24:04.377889 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377195 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 04:24:04.377889 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377198 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 04:24:04.377889 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377200 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 04:24:04.377889 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377203 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 04:24:04.377889 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377206 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 04:24:04.377889 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377208 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 04:24:04.377889 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377211 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 04:24:04.377889 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377214 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 04:24:04.377889 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377216 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 04:24:04.377889 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377220 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 04:24:04.377889 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377223 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 04:24:04.377889 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377225 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 04:24:04.377889 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377228 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 04:24:04.377889 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377230 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 04:24:04.377889 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377233 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 04:24:04.377889 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377235 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 04:24:04.378379 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377238 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 04:24:04.378379 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377241 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 04:24:04.378379 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377244 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 04:24:04.378379 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377247 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 04:24:04.378379 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377249 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 04:24:04.378379 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377252 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 04:24:04.378379 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377255 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 04:24:04.378379 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377257 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 04:24:04.378379 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377260 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 04:24:04.378379 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377262 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 04:24:04.378379 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377265 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 04:24:04.378379 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377267 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 04:24:04.378379 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377270 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 04:24:04.378379 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377273 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 04:24:04.378379 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377275 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 04:24:04.378379 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377278 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 04:24:04.378379 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377281 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 04:24:04.378379 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377283 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 04:24:04.378379 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377286 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 04:24:04.378379 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377289 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 04:24:04.378931 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377292 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 04:24:04.378931 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377294 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 04:24:04.378931 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377297 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 04:24:04.378931 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377301 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 04:24:04.378931 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377305 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 04:24:04.378931 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377308 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 04:24:04.378931 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377311 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 04:24:04.378931 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377314 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 04:24:04.378931 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377317 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 04:24:04.378931 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377319 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 04:24:04.378931 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377322 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 04:24:04.378931 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377325 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 04:24:04.378931 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377328 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 04:24:04.378931 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377331 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 04:24:04.378931 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377333 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 04:24:04.378931 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377336 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 04:24:04.378931 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377338 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 04:24:04.378931 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377341 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 04:24:04.378931 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377344 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 04:24:04.379400 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377349 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 04:24:04.379400 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377352 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 04:24:04.379400 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377354 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 04:24:04.379400 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377357 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 04:24:04.379400 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377360 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 04:24:04.379400 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377363 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 04:24:04.379400 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377365 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 04:24:04.379400 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377368 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 04:24:04.379400 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377370 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 04:24:04.379400 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377373 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 04:24:04.379400 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377376 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 04:24:04.379400 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377378 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 04:24:04.379400 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377381 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 04:24:04.379400 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377384 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 04:24:04.379400 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377387 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 04:24:04.379400 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377390 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 04:24:04.379400 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377392 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 04:24:04.379400 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377395 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 04:24:04.379858 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.377400 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 04:24:04.379858 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377503 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 04:24:04.379858 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377509 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 04:24:04.379858 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377512 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 04:24:04.379858 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377515 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 04:24:04.379858 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377517 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 04:24:04.379858 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377520 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 04:24:04.379858 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377523 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 04:24:04.379858 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377525 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 04:24:04.379858 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377528 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 04:24:04.379858 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377531 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 04:24:04.379858 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377533 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 04:24:04.379858 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377536 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 04:24:04.379858 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377538 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 04:24:04.379858 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377541 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 04:24:04.379858 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377543 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 04:24:04.380257 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377546 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 04:24:04.380257 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377548 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 04:24:04.380257 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377551 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 04:24:04.380257 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377553 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 04:24:04.380257 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377556 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 04:24:04.380257 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377559 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 04:24:04.380257 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377561 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 04:24:04.380257 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377564 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 04:24:04.380257 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377567 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 04:24:04.380257 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377569 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 04:24:04.380257 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377572 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 04:24:04.380257 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377575 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 04:24:04.380257 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377578 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 04:24:04.380257 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377580 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 04:24:04.380257 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377583 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 04:24:04.380257 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377585 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 04:24:04.380257 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377588 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 04:24:04.380257 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377590 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 04:24:04.380257 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377593 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 04:24:04.380257 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377596 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 04:24:04.380768 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377599 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 04:24:04.380768 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377601 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 04:24:04.380768 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377604 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 04:24:04.380768 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377606 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 04:24:04.380768 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377609 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 04:24:04.380768 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377612 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 04:24:04.380768 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377615 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 04:24:04.380768 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377617 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 04:24:04.380768 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377620 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 04:24:04.380768 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377622 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 04:24:04.380768 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377625 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 04:24:04.380768 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377627 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 04:24:04.380768 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377630 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 04:24:04.380768 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377633 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 04:24:04.380768 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377635 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 04:24:04.380768 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377638 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 04:24:04.380768 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377640 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 04:24:04.380768 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377643 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 04:24:04.380768 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377645 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 04:24:04.380768 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377648 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 04:24:04.381307 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377651 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 04:24:04.381307 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377653 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 04:24:04.381307 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377656 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 04:24:04.381307 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377659 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 04:24:04.381307 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377662 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 04:24:04.381307 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377665 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 04:24:04.381307 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377667 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 04:24:04.381307 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377670 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 04:24:04.381307 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377672 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 04:24:04.381307 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377675 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 04:24:04.381307 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377679 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 04:24:04.381307 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377682 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 04:24:04.381307 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377686 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 04:24:04.381307 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377689 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 04:24:04.381307 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377691 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 04:24:04.381307 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377694 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 04:24:04.381307 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377697 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 04:24:04.381307 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377700 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 04:24:04.381307 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377703 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 04:24:04.381779 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377705 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 04:24:04.381779 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377707 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 04:24:04.381779 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377710 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 04:24:04.381779 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377712 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 04:24:04.381779 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377715 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 04:24:04.381779 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377717 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 04:24:04.381779 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377721 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 04:24:04.381779 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377724 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 04:24:04.381779 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377726 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 04:24:04.381779 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377729 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 04:24:04.381779 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377732 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 04:24:04.381779 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:04.377734 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 04:24:04.381779 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.377739 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 04:24:04.381779 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.378540 2567 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 04:24:04.381779 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.380491 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 04:24:04.382312 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.381979 2567 server.go:1019] "Starting client certificate rotation"
Apr 16 04:24:04.382312 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.382084 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 04:24:04.382974 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.382962 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 04:24:04.408564 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.408539 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 04:24:04.412182 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.412149 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 04:24:04.429543 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.429516 2567 log.go:25] "Validated CRI v1 runtime API"
Apr 16 04:24:04.435258 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.435236 2567 log.go:25] "Validated CRI v1 image API"
Apr 16 04:24:04.438188 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.438168 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 04:24:04.439072 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.439049 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 04:24:04.442556 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.442528 2567 fs.go:135] Filesystem UUIDs: map[141ca108-f11e-4457-aaef-27107d84bd68:/dev/nvme0n1p4 31ab18f5-1c67-498a-a60c-105b45957b28:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 16 04:24:04.442636 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.442554 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 04:24:04.447758 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.447631 2567 manager.go:217] Machine: {Timestamp:2026-04-16 04:24:04.446460068 +0000 UTC m=+0.431910123 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099889 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2ec1d5fee10e00c3f37387160d4575 SystemUUID:ec2ec1d5-fee1-0e00-c3f3-7387160d4575 BootID:680527ef-d6ec-49f0-9f48-badd2c3ef8d7 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:8d:2b:fc:cf:75 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:8d:2b:fc:cf:75 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:22:36:e3:f3:71:e0 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 04:24:04.447758 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.447745 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 04:24:04.447952 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.447935 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 04:24:04.449543 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.449512 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 04:24:04.449712 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.449545 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-81.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 04:24:04.449789 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.449727 2567 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 04:24:04.449789 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.449740 2567 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 04:24:04.449789 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.449766 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 04:24:04.449924 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.449790 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 04:24:04.450776 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.450762 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 16 04:24:04.450924 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.450913 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 04:24:04.453490 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.453478 2567 kubelet.go:491] "Attempting to sync node with API server" Apr 16 04:24:04.453551 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.453497 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 04:24:04.454240 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.454228 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 04:24:04.454300 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.454248 2567 kubelet.go:397] "Adding apiserver pod source" Apr 16 04:24:04.454300 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.454262 2567 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 16 04:24:04.455322 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.455309 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 04:24:04.455397 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.455332 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 04:24:04.458552 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.458532 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 04:24:04.460217 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.460204 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 04:24:04.461744 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.461729 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 04:24:04.461787 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.461754 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 04:24:04.461787 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.461764 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 04:24:04.461787 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.461774 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 04:24:04.461787 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.461782 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 04:24:04.461918 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.461791 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 04:24:04.461918 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.461801 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 
04:24:04.461918 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.461809 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 04:24:04.461918 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.461820 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 04:24:04.461918 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.461847 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 04:24:04.461918 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.461870 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 04:24:04.461918 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.461882 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 04:24:04.463880 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.463866 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 04:24:04.463926 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.463885 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 04:24:04.466493 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:04.466469 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-81.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 04:24:04.466566 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:04.466467 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 04:24:04.467819 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.467806 2567 watchdog_linux.go:99] 
"Systemd watchdog is not enabled" Apr 16 04:24:04.467883 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.467862 2567 server.go:1295] "Started kubelet" Apr 16 04:24:04.467991 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.467929 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 04:24:04.468509 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.468432 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 04:24:04.468725 ip-10-0-133-81 systemd[1]: Started Kubernetes Kubelet. Apr 16 04:24:04.472698 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.472508 2567 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 04:24:04.474721 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.474143 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 04:24:04.474875 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.474854 2567 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-81.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 04:24:04.475493 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.475475 2567 server.go:317] "Adding debug handlers to kubelet server" Apr 16 04:24:04.476490 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.476463 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hmjcs" Apr 16 04:24:04.480207 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.480193 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 04:24:04.480782 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.480766 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 04:24:04.480880 
ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:04.480783 2567 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 04:24:04.481319 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.481305 2567 factory.go:55] Registering systemd factory Apr 16 04:24:04.481319 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.481320 2567 factory.go:223] Registration of the systemd container factory successfully Apr 16 04:24:04.481434 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.481401 2567 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 04:24:04.481434 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.481416 2567 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 04:24:04.481519 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.481439 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 04:24:04.481519 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.481493 2567 reconstruct.go:97] "Volume reconstruction finished" Apr 16 04:24:04.481519 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.481501 2567 reconciler.go:26] "Reconciler: start to sync state" Apr 16 04:24:04.481649 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.481567 2567 factory.go:153] Registering CRI-O factory Apr 16 04:24:04.481649 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.481577 2567 factory.go:223] Registration of the crio container factory successfully Apr 16 04:24:04.481736 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.481657 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 04:24:04.481736 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.481681 2567 factory.go:103] Registering Raw 
factory Apr 16 04:24:04.481736 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.481694 2567 manager.go:1196] Started watching for new ooms in manager Apr 16 04:24:04.481944 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:04.481923 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-81.ec2.internal\" not found" Apr 16 04:24:04.482512 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.482499 2567 manager.go:319] Starting recovery of all containers Apr 16 04:24:04.483664 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.483641 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hmjcs" Apr 16 04:24:04.486621 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:04.486563 2567 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-81.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 04:24:04.486901 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:04.486870 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 04:24:04.487964 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:04.486523 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-81.ec2.internal.18a6bbaa9b4140b9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-81.ec2.internal,UID:ip-10-0-133-81.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-81.ec2.internal,},FirstTimestamp:2026-04-16 04:24:04.467818681 +0000 UTC m=+0.453268736,LastTimestamp:2026-04-16 04:24:04.467818681 +0000 UTC m=+0.453268736,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-81.ec2.internal,}" Apr 16 04:24:04.493111 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.493096 2567 manager.go:324] Recovery completed Apr 16 04:24:04.497789 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.497775 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 04:24:04.500354 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.500339 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-81.ec2.internal" event="NodeHasSufficientMemory" Apr 16 04:24:04.500425 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.500369 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-81.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 04:24:04.500425 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.500382 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-81.ec2.internal" event="NodeHasSufficientPID" Apr 16 04:24:04.500974 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.500962 2567 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 04:24:04.500974 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.500975 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 04:24:04.501070 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.500989 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 16 04:24:04.503147 ip-10-0-133-81 
kubenswrapper[2567]: I0416 04:24:04.503132 2567 policy_none.go:49] "None policy: Start" Apr 16 04:24:04.503147 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.503148 2567 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 04:24:04.503246 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.503158 2567 state_mem.go:35] "Initializing new in-memory state store" Apr 16 04:24:04.540485 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.540459 2567 manager.go:341] "Starting Device Plugin manager" Apr 16 04:24:04.557764 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:04.540557 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 04:24:04.557764 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.540571 2567 server.go:85] "Starting device plugin registration server" Apr 16 04:24:04.557764 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.540860 2567 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 04:24:04.557764 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.540873 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 04:24:04.557764 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.540968 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 04:24:04.557764 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.541056 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 04:24:04.557764 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.541066 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 04:24:04.557764 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:04.541653 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 04:24:04.557764 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:04.541690 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-81.ec2.internal\" not found" Apr 16 04:24:04.641229 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.641128 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 04:24:04.642170 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.642154 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-81.ec2.internal" event="NodeHasSufficientMemory" Apr 16 04:24:04.642238 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.642186 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-81.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 04:24:04.642238 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.642196 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-81.ec2.internal" event="NodeHasSufficientPID" Apr 16 04:24:04.642238 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.642225 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-81.ec2.internal" Apr 16 04:24:04.651251 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.651233 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-81.ec2.internal" Apr 16 04:24:04.651297 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:04.651257 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-81.ec2.internal\": node \"ip-10-0-133-81.ec2.internal\" not found" Apr 16 04:24:04.667818 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.667784 2567 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 16 04:24:04.669028 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.669010 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 04:24:04.669151 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.669038 2567 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 04:24:04.669151 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.669058 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 04:24:04.669151 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.669064 2567 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 04:24:04.669151 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:04.669098 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 04:24:04.671948 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.671932 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 04:24:04.674443 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:04.674428 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-81.ec2.internal\" not found" Apr 16 04:24:04.769424 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.769389 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-81.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-81.ec2.internal"] Apr 16 04:24:04.769531 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.769477 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 04:24:04.771101 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.771084 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-81.ec2.internal" event="NodeHasSufficientMemory" Apr 16 04:24:04.771190 
ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.771120 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-81.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 04:24:04.771190 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.771134 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-81.ec2.internal" event="NodeHasSufficientPID" Apr 16 04:24:04.772340 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.772324 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 04:24:04.772475 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.772461 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-81.ec2.internal" Apr 16 04:24:04.772525 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.772494 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 04:24:04.773176 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.773163 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-81.ec2.internal" event="NodeHasSufficientMemory" Apr 16 04:24:04.773258 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.773164 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-81.ec2.internal" event="NodeHasSufficientMemory" Apr 16 04:24:04.773258 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.773194 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-81.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 04:24:04.773258 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.773208 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-81.ec2.internal" event="NodeHasSufficientPID" Apr 16 04:24:04.773258 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.773210 2567 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-133-81.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 04:24:04.773258 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.773226 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-81.ec2.internal" event="NodeHasSufficientPID" Apr 16 04:24:04.774966 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.774949 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-81.ec2.internal" Apr 16 04:24:04.775042 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.774986 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 04:24:04.775042 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:04.774950 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-81.ec2.internal\" not found" Apr 16 04:24:04.775855 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.775820 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-81.ec2.internal" event="NodeHasSufficientMemory" Apr 16 04:24:04.775934 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.775870 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-81.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 04:24:04.775934 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.775885 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-81.ec2.internal" event="NodeHasSufficientPID" Apr 16 04:24:04.783070 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.783054 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4d62c969dcba3db9916e6f511688c3bd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-81.ec2.internal\" (UID: \"4d62c969dcba3db9916e6f511688c3bd\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-81.ec2.internal" Apr 16 04:24:04.783133 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.783079 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d62c969dcba3db9916e6f511688c3bd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-81.ec2.internal\" (UID: \"4d62c969dcba3db9916e6f511688c3bd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-81.ec2.internal" Apr 16 04:24:04.783133 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.783096 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/558b1f9822c72f9ef387e865da1b3b63-config\") pod \"kube-apiserver-proxy-ip-10-0-133-81.ec2.internal\" (UID: \"558b1f9822c72f9ef387e865da1b3b63\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-81.ec2.internal" Apr 16 04:24:04.805801 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:04.805778 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-81.ec2.internal\" not found" node="ip-10-0-133-81.ec2.internal" Apr 16 04:24:04.810345 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:04.810324 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-81.ec2.internal\" not found" node="ip-10-0-133-81.ec2.internal" Apr 16 04:24:04.875559 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:04.875524 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-81.ec2.internal\" not found" Apr 16 04:24:04.883953 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.883915 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/558b1f9822c72f9ef387e865da1b3b63-config\") pod \"kube-apiserver-proxy-ip-10-0-133-81.ec2.internal\" (UID: \"558b1f9822c72f9ef387e865da1b3b63\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-81.ec2.internal" Apr 16 04:24:04.883953 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.883958 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4d62c969dcba3db9916e6f511688c3bd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-81.ec2.internal\" (UID: \"4d62c969dcba3db9916e6f511688c3bd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-81.ec2.internal" Apr 16 04:24:04.884076 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.883977 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d62c969dcba3db9916e6f511688c3bd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-81.ec2.internal\" (UID: \"4d62c969dcba3db9916e6f511688c3bd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-81.ec2.internal" Apr 16 04:24:04.884076 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.884015 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d62c969dcba3db9916e6f511688c3bd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-81.ec2.internal\" (UID: \"4d62c969dcba3db9916e6f511688c3bd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-81.ec2.internal" Apr 16 04:24:04.884076 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.884020 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/558b1f9822c72f9ef387e865da1b3b63-config\") pod \"kube-apiserver-proxy-ip-10-0-133-81.ec2.internal\" (UID: \"558b1f9822c72f9ef387e865da1b3b63\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-133-81.ec2.internal"
Apr 16 04:24:04.884076 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:04.884037 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4d62c969dcba3db9916e6f511688c3bd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-81.ec2.internal\" (UID: \"4d62c969dcba3db9916e6f511688c3bd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-81.ec2.internal"
Apr 16 04:24:04.976391 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:04.976325 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-81.ec2.internal\" not found"
Apr 16 04:24:05.076842 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:05.076797 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-81.ec2.internal\" not found"
Apr 16 04:24:05.109242 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:05.109218 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-81.ec2.internal"
Apr 16 04:24:05.113112 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:05.113091 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-81.ec2.internal"
Apr 16 04:24:05.177626 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:05.177585 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-81.ec2.internal\" not found"
Apr 16 04:24:05.278155 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:05.278086 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-81.ec2.internal\" not found"
Apr 16 04:24:05.378715 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:05.378678 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-81.ec2.internal\" not found"
Apr 16 04:24:05.381860 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:05.381847 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 04:24:05.381998 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:05.381983 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 04:24:05.479370 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:05.479342 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-81.ec2.internal\" not found"
Apr 16 04:24:05.480838 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:05.480811 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 04:24:05.485991 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:05.485965 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 04:19:04 +0000 UTC" deadline="2027-12-30 17:14:59.95580605 +0000 UTC"
Apr 16 04:24:05.486067 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:05.485992 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14964h50m54.469817432s"
Apr 16 04:24:05.492972 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:05.492953 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 04:24:05.515298 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:05.515270 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-jvgj2"
Apr 16 04:24:05.519430 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:05.519414 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-jvgj2"
Apr 16 04:24:05.580188 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:05.580155 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-81.ec2.internal\" not found"
Apr 16 04:24:05.606352 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:05.606327 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 04:24:05.673451 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:05.673417 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod558b1f9822c72f9ef387e865da1b3b63.slice/crio-54da9ed2cfcdbe47c14172d857c11974dda7439356208bc69c6bbf65548721be WatchSource:0}: Error finding container 54da9ed2cfcdbe47c14172d857c11974dda7439356208bc69c6bbf65548721be: Status 404 returned error can't find the container with id 54da9ed2cfcdbe47c14172d857c11974dda7439356208bc69c6bbf65548721be
Apr 16 04:24:05.673798 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:05.673775 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d62c969dcba3db9916e6f511688c3bd.slice/crio-0d609c9c14562eaa0473b434f852f4b1a84461fcf0132f3e2b80c955038b1919 WatchSource:0}: Error finding container 0d609c9c14562eaa0473b434f852f4b1a84461fcf0132f3e2b80c955038b1919: Status 404 returned error can't find the container with id 0d609c9c14562eaa0473b434f852f4b1a84461fcf0132f3e2b80c955038b1919
Apr 16 04:24:05.680279 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:05.680255 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-81.ec2.internal\" not found"
Apr 16 04:24:05.681120 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:05.681104 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 04:24:05.780437 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:05.780391 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-81.ec2.internal\" not found"
Apr 16 04:24:05.791097 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:05.791068 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 04:24:05.830113 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:05.830044 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 04:24:05.881298 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:05.881256 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-81.ec2.internal"
Apr 16 04:24:05.896145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:05.896113 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 04:24:05.897094 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:05.897080 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-81.ec2.internal"
Apr 16 04:24:05.904925 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:05.904908 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 04:24:06.264054 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.263822 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 04:24:06.455865 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.455817 2567 apiserver.go:52] "Watching apiserver"
Apr 16 04:24:06.465502 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.464884 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 04:24:06.466189 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.466098 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-28th9","openshift-cluster-node-tuning-operator/tuned-qvg6g","openshift-dns/node-resolver-7pjb8","openshift-image-registry/node-ca-tns7g","openshift-multus/multus-additional-cni-plugins-6wjzz","openshift-multus/multus-n4qkb","openshift-network-operator/iptables-alerter-xgrdb","kube-system/konnectivity-agent-x8979","kube-system/kube-apiserver-proxy-ip-10-0-133-81.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-81.ec2.internal","openshift-multus/network-metrics-daemon-j6hlh","openshift-network-diagnostics/network-check-target-b2zb5","openshift-ovn-kubernetes/ovnkube-node-s79zz"]
Apr 16 04:24:06.468352 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.468269 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xgrdb"
Apr 16 04:24:06.469335 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.469315 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2zb5"
Apr 16 04:24:06.469455 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:06.469389 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2zb5" podUID="c1c49d2a-98d1-4d28-9e17-3967b6431a92"
Apr 16 04:24:06.470381 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.470356 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7pjb8"
Apr 16 04:24:06.471229 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.471084 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 04:24:06.471686 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.471420 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 04:24:06.471686 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.471540 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-n882s\""
Apr 16 04:24:06.471686 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.471614 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 04:24:06.473021 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.472432 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tns7g"
Apr 16 04:24:06.474082 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.473846 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 04:24:06.474082 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.473913 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 04:24:06.474082 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.473854 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-rg4l9\""
Apr 16 04:24:06.474641 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.474621 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6wjzz"
Apr 16 04:24:06.474982 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.474755 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 04:24:06.475343 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.475323 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4kjlp\""
Apr 16 04:24:06.475447 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.475355 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 04:24:06.475590 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.475571 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 04:24:06.475862 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.475842 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-n4qkb"
Apr 16 04:24:06.477173 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.477129 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 04:24:06.477314 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.477297 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s79zz"
Apr 16 04:24:06.477789 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.477528 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-bhlg2\""
Apr 16 04:24:06.478324 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.478035 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 04:24:06.478324 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.478116 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 04:24:06.478324 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.478200 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 04:24:06.478324 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.478231 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mt6hr\""
Apr 16 04:24:06.478324 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.478240 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 04:24:06.478595 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.478350 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 04:24:06.479277 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.479259 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-x8979"
Apr 16 04:24:06.479693 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.479649 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 04:24:06.480336 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.480314 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-t579d\""
Apr 16 04:24:06.480629 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.480610 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 04:24:06.480714 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.480627 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qvg6g"
Apr 16 04:24:06.480784 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.480722 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 04:24:06.480855 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.480792 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 04:24:06.480912 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.480611 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 04:24:06.481170 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.481134 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 04:24:06.481698 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.481629 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qxqfm\""
Apr 16 04:24:06.481698 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.481673 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 04:24:06.481880 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.481750 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 04:24:06.482094 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.482014 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6hlh"
Apr 16 04:24:06.482094 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:06.482081 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6hlh" podUID="638d6e19-46c9-4d63-a7b2-461e842da022"
Apr 16 04:24:06.483084 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.483062 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 04:24:06.483286 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.483267 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 04:24:06.483350 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.483322 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9r2xw\""
Apr 16 04:24:06.483891 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.483700 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr"
Apr 16 04:24:06.485608 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.485480 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28th9"
Apr 16 04:24:06.485608 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:06.485553 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28th9" podUID="dcdc9bd2-6cdd-48d1-850f-80adbc878d5f"
Apr 16 04:24:06.487073 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.487053 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 04:24:06.487419 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.487377 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 04:24:06.487787 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.487767 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 04:24:06.487890 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.487807 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-2l45v\""
Apr 16 04:24:06.490653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.490623 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-var-lib-kubelet\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g"
Apr 16 04:24:06.490788 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.490774 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-multus-cni-dir\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb"
Apr 16 04:24:06.490917 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.490902 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-run-systemd\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz"
Apr 16 04:24:06.491034 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.491018 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/84db4e3c-a849-4173-b21b-fbb75fd25be3-cni-binary-copy\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz"
Apr 16 04:24:06.491137 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.491124 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-host-var-lib-cni-multus\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb"
Apr 16 04:24:06.491250 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.491226 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-node-log\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz"
Apr 16 04:24:06.491361 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.491262 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fbe3cd6d-315c-4d44-81a3-217be3d98348-ovnkube-script-lib\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz"
Apr 16 04:24:06.491361 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.491286 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-etc-modprobe-d\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g"
Apr 16 04:24:06.491361 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.491322 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-etc-sysctl-conf\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g"
Apr 16 04:24:06.491361 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.491350 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0423948b-a820-4490-bb94-3810fdef3c06-host-slash\") pod \"iptables-alerter-xgrdb\" (UID: \"0423948b-a820-4490-bb94-3810fdef3c06\") " pod="openshift-network-operator/iptables-alerter-xgrdb"
Apr 16 04:24:06.491569 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.491379 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2kbm\" (UniqueName: \"kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm\") pod \"network-check-target-b2zb5\" (UID: \"c1c49d2a-98d1-4d28-9e17-3967b6431a92\") " pod="openshift-network-diagnostics/network-check-target-b2zb5"
Apr 16 04:24:06.492078 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.492036 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/84db4e3c-a849-4173-b21b-fbb75fd25be3-os-release\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz"
Apr 16 04:24:06.492234 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.492217 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/98bbb6ac-6605-4f70-9681-333eea1951c2-agent-certs\") pod \"konnectivity-agent-x8979\" (UID: \"98bbb6ac-6605-4f70-9681-333eea1951c2\") " pod="kube-system/konnectivity-agent-x8979"
Apr 16 04:24:06.492288 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.492259 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/98bbb6ac-6605-4f70-9681-333eea1951c2-konnectivity-ca\") pod \"konnectivity-agent-x8979\" (UID: \"98bbb6ac-6605-4f70-9681-333eea1951c2\") " pod="kube-system/konnectivity-agent-x8979"
Apr 16 04:24:06.492332 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.492308 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6b433758-a3ea-42f0-997e-16cc16207047-etc-tuned\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g"
Apr 16 04:24:06.492385 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.492349 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b433758-a3ea-42f0-997e-16cc16207047-tmp\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g"
Apr 16 04:24:06.493351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.492384 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c35154e3-77e2-4f41-b5e8-99905ca385f9-hosts-file\") pod \"node-resolver-7pjb8\" (UID: \"c35154e3-77e2-4f41-b5e8-99905ca385f9\") " pod="openshift-dns/node-resolver-7pjb8"
Apr 16 04:24:06.493351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.492492 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-host-run-k8s-cni-cncf-io\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb"
Apr 16 04:24:06.493351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.492525 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-etc-kubernetes\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb"
Apr 16 04:24:06.493351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.492555 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fbe3cd6d-315c-4d44-81a3-217be3d98348-ovn-node-metrics-cert\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz"
Apr 16 04:24:06.493351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.492586 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlx8d\" (UniqueName: \"kubernetes.io/projected/fbe3cd6d-315c-4d44-81a3-217be3d98348-kube-api-access-zlx8d\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz"
Apr 16 04:24:06.493351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.492620 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-etc-kubernetes\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g"
Apr 16 04:24:06.493351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.492874 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs\") pod \"network-metrics-daemon-j6hlh\" (UID: \"638d6e19-46c9-4d63-a7b2-461e842da022\") " pod="openshift-multus/network-metrics-daemon-j6hlh"
Apr 16 04:24:06.493351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.492935 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/84db4e3c-a849-4173-b21b-fbb75fd25be3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz"
Apr 16 04:24:06.493351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.492969 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-var-lib-openvswitch\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz"
Apr 16 04:24:06.493351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.493006 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f41140f0-dc31-4907-aae1-f8d108bb517f-host\") pod \"node-ca-tns7g\" (UID: \"f41140f0-dc31-4907-aae1-f8d108bb517f\") " pod="openshift-image-registry/node-ca-tns7g"
Apr 16 04:24:06.493351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.493042 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxbj2\" (UniqueName: \"kubernetes.io/projected/f41140f0-dc31-4907-aae1-f8d108bb517f-kube-api-access-cxbj2\") pod \"node-ca-tns7g\" (UID: \"f41140f0-dc31-4907-aae1-f8d108bb517f\") " pod="openshift-image-registry/node-ca-tns7g"
Apr 16 04:24:06.493351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.493073 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-sys\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g"
Apr 16 04:24:06.493351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.493115 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0423948b-a820-4490-bb94-3810fdef3c06-iptables-alerter-script\") pod \"iptables-alerter-xgrdb\" (UID: \"0423948b-a820-4490-bb94-3810fdef3c06\") " pod="openshift-network-operator/iptables-alerter-xgrdb"
Apr 16 04:24:06.493351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.493162 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/84db4e3c-a849-4173-b21b-fbb75fd25be3-system-cni-dir\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz"
Apr 16 04:24:06.493351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.493191 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/84db4e3c-a849-4173-b21b-fbb75fd25be3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz"
Apr 16 04:24:06.493351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.493222 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-os-release\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb"
Apr 16 04:24:06.494145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.493251 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjnlp\" (UniqueName: \"kubernetes.io/projected/c70b5e71-9ba3-4891-851c-653635c97ffb-kube-api-access-tjnlp\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb"
Apr 16 04:24:06.494145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.493284 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-etc-sysconfig\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g"
Apr 16 04:24:06.494145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.493314 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg2vk\" (UniqueName: \"kubernetes.io/projected/c35154e3-77e2-4f41-b5e8-99905ca385f9-kube-api-access-lg2vk\") pod \"node-resolver-7pjb8\" (UID: \"c35154e3-77e2-4f41-b5e8-99905ca385f9\") " pod="openshift-dns/node-resolver-7pjb8"
Apr 16 04:24:06.494145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.493343 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c70b5e71-9ba3-4891-851c-653635c97ffb-cni-binary-copy\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb"
Apr 16 04:24:06.494145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.493370 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-host\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g"
Apr 16 04:24:06.494145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.493400 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnxtc\" (UniqueName: \"kubernetes.io/projected/0423948b-a820-4490-bb94-3810fdef3c06-kube-api-access-fnxtc\") pod \"iptables-alerter-xgrdb\" (UID: \"0423948b-a820-4490-bb94-3810fdef3c06\") " pod="openshift-network-operator/iptables-alerter-xgrdb"
Apr 16 04:24:06.494145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.493429 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/84db4e3c-a849-4173-b21b-fbb75fd25be3-cnibin\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz"
Apr 16 04:24:06.494145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.493458 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54h82\" (UniqueName: \"kubernetes.io/projected/84db4e3c-a849-4173-b21b-fbb75fd25be3-kube-api-access-54h82\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz"
Apr 16 04:24:06.494145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.493492 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-multus-socket-dir-parent\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb"
Apr 16 04:24:06.494145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.493980 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-host-kubelet\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz"
Apr 16 04:24:06.494529 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.494201 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f41140f0-dc31-4907-aae1-f8d108bb517f-serviceca\") pod \"node-ca-tns7g\" (UID: \"f41140f0-dc31-4907-aae1-f8d108bb517f\") " pod="openshift-image-registry/node-ca-tns7g"
Apr 16 04:24:06.494529 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.494236 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-run-ovn\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz"
Apr 16 04:24:06.494529 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.494321 2567 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-host-run-ovn-kubernetes\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.494529 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.494372 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-host-cni-netd\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.494529 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.494401 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fbe3cd6d-315c-4d44-81a3-217be3d98348-env-overrides\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.494529 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.494435 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-etc-systemd\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.494529 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.494471 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-hostroot\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.494722 
ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.494558 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c70b5e71-9ba3-4891-851c-653635c97ffb-multus-daemon-config\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.494722 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.494588 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-host-run-multus-certs\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.494776 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.494731 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-host-slash\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.494862 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.494788 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-host-run-netns\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.494932 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.494859 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-run-openvswitch\") pod \"ovnkube-node-s79zz\" (UID: 
\"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.494932 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.494891 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-log-socket\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.495022 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.494964 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.495070 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.495009 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-run\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.495070 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.495055 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w752p\" (UniqueName: \"kubernetes.io/projected/6b433758-a3ea-42f0-997e-16cc16207047-kube-api-access-w752p\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.495158 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.495138 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-systemd-units\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.495201 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.495168 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-lib-modules\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.495279 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.495262 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-system-cni-dir\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.495329 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.495295 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-host-var-lib-kubelet\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.495506 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.495478 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-host-cni-bin\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.495565 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.495525 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-etc-sysctl-d\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.495610 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.495581 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c35154e3-77e2-4f41-b5e8-99905ca385f9-tmp-dir\") pod \"node-resolver-7pjb8\" (UID: \"c35154e3-77e2-4f41-b5e8-99905ca385f9\") " pod="openshift-dns/node-resolver-7pjb8" Apr 16 04:24:06.495653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.495611 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/84db4e3c-a849-4173-b21b-fbb75fd25be3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz" Apr 16 04:24:06.495700 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.495668 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-cnibin\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.495700 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.495694 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-host-run-netns\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.495791 
ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.495750 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-host-var-lib-cni-bin\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.495854 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.495792 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-multus-conf-dir\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.495904 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.495851 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b8xf\" (UniqueName: \"kubernetes.io/projected/638d6e19-46c9-4d63-a7b2-461e842da022-kube-api-access-6b8xf\") pod \"network-metrics-daemon-j6hlh\" (UID: \"638d6e19-46c9-4d63-a7b2-461e842da022\") " pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:24:06.495953 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.495904 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-etc-openvswitch\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.496114 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.496064 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fbe3cd6d-315c-4d44-81a3-217be3d98348-ovnkube-config\") pod \"ovnkube-node-s79zz\" (UID: 
\"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.521679 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.521645 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 04:19:05 +0000 UTC" deadline="2027-10-29 04:00:34.317575866 +0000 UTC" Apr 16 04:24:06.521679 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.521678 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13463h36m27.795901522s" Apr 16 04:24:06.582816 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.582779 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 04:24:06.596588 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.596552 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mptj\" (UniqueName: \"kubernetes.io/projected/70cfcc27-c214-4fec-a12c-618606910cbf-kube-api-access-2mptj\") pod \"aws-ebs-csi-driver-node-cmsfr\" (UID: \"70cfcc27-c214-4fec-a12c-618606910cbf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" Apr 16 04:24:06.596773 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.596607 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/84db4e3c-a849-4173-b21b-fbb75fd25be3-cni-binary-copy\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz" Apr 16 04:24:06.596773 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.596634 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-host-var-lib-cni-multus\") pod \"multus-n4qkb\" (UID: 
\"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.596773 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.596657 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-node-log\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.596773 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.596679 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fbe3cd6d-315c-4d44-81a3-217be3d98348-ovnkube-script-lib\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.596773 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.596703 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-etc-modprobe-d\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.596773 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.596725 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-etc-sysctl-conf\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.596773 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.596727 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-host-var-lib-cni-multus\") pod \"multus-n4qkb\" (UID: 
\"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.596773 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.596765 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70cfcc27-c214-4fec-a12c-618606910cbf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cmsfr\" (UID: \"70cfcc27-c214-4fec-a12c-618606910cbf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" Apr 16 04:24:06.597240 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.596771 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-node-log\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.597240 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.596791 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0423948b-a820-4490-bb94-3810fdef3c06-host-slash\") pod \"iptables-alerter-xgrdb\" (UID: \"0423948b-a820-4490-bb94-3810fdef3c06\") " pod="openshift-network-operator/iptables-alerter-xgrdb" Apr 16 04:24:06.597240 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.596819 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2kbm\" (UniqueName: \"kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm\") pod \"network-check-target-b2zb5\" (UID: \"c1c49d2a-98d1-4d28-9e17-3967b6431a92\") " pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:24:06.597240 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.596861 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/84db4e3c-a849-4173-b21b-fbb75fd25be3-os-release\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz" Apr 16 04:24:06.597240 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.596887 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-etc-modprobe-d\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.597240 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.596943 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/98bbb6ac-6605-4f70-9681-333eea1951c2-agent-certs\") pod \"konnectivity-agent-x8979\" (UID: \"98bbb6ac-6605-4f70-9681-333eea1951c2\") " pod="kube-system/konnectivity-agent-x8979" Apr 16 04:24:06.597240 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.596953 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0423948b-a820-4490-bb94-3810fdef3c06-host-slash\") pod \"iptables-alerter-xgrdb\" (UID: \"0423948b-a820-4490-bb94-3810fdef3c06\") " pod="openshift-network-operator/iptables-alerter-xgrdb" Apr 16 04:24:06.597240 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.596969 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/98bbb6ac-6605-4f70-9681-333eea1951c2-konnectivity-ca\") pod \"konnectivity-agent-x8979\" (UID: \"98bbb6ac-6605-4f70-9681-333eea1951c2\") " pod="kube-system/konnectivity-agent-x8979" Apr 16 04:24:06.597240 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.596982 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" 
(UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-etc-sysctl-conf\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.597240 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597003 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6b433758-a3ea-42f0-997e-16cc16207047-etc-tuned\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.597240 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597144 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b433758-a3ea-42f0-997e-16cc16207047-tmp\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.597240 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597151 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/84db4e3c-a849-4173-b21b-fbb75fd25be3-os-release\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz" Apr 16 04:24:06.597240 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597200 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c35154e3-77e2-4f41-b5e8-99905ca385f9-hosts-file\") pod \"node-resolver-7pjb8\" (UID: \"c35154e3-77e2-4f41-b5e8-99905ca385f9\") " pod="openshift-dns/node-resolver-7pjb8" Apr 16 04:24:06.597875 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597362 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/c35154e3-77e2-4f41-b5e8-99905ca385f9-hosts-file\") pod \"node-resolver-7pjb8\" (UID: \"c35154e3-77e2-4f41-b5e8-99905ca385f9\") " pod="openshift-dns/node-resolver-7pjb8" Apr 16 04:24:06.597875 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597408 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-host-run-k8s-cni-cncf-io\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.597875 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597435 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-etc-kubernetes\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.597875 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597428 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 04:24:06.597875 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597483 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-host-run-k8s-cni-cncf-io\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.597875 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597514 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fbe3cd6d-315c-4d44-81a3-217be3d98348-ovn-node-metrics-cert\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.597875 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597510 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fbe3cd6d-315c-4d44-81a3-217be3d98348-ovnkube-script-lib\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.597875 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597541 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlx8d\" (UniqueName: \"kubernetes.io/projected/fbe3cd6d-315c-4d44-81a3-217be3d98348-kube-api-access-zlx8d\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.597875 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597570 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-etc-kubernetes\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.597875 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597578 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/98bbb6ac-6605-4f70-9681-333eea1951c2-konnectivity-ca\") pod \"konnectivity-agent-x8979\" (UID: \"98bbb6ac-6605-4f70-9681-333eea1951c2\") " pod="kube-system/konnectivity-agent-x8979" Apr 16 04:24:06.597875 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597626 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-etc-kubernetes\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.597875 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597655 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs\") pod \"network-metrics-daemon-j6hlh\" (UID: \"638d6e19-46c9-4d63-a7b2-461e842da022\") " pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:24:06.597875 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597689 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/84db4e3c-a849-4173-b21b-fbb75fd25be3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz" Apr 16 04:24:06.597875 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597720 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-var-lib-openvswitch\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.597875 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597748 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f41140f0-dc31-4907-aae1-f8d108bb517f-host\") pod \"node-ca-tns7g\" (UID: \"f41140f0-dc31-4907-aae1-f8d108bb517f\") " pod="openshift-image-registry/node-ca-tns7g" Apr 16 04:24:06.597875 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597770 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxbj2\" (UniqueName: \"kubernetes.io/projected/f41140f0-dc31-4907-aae1-f8d108bb517f-kube-api-access-cxbj2\") pod \"node-ca-tns7g\" (UID: \"f41140f0-dc31-4907-aae1-f8d108bb517f\") " pod="openshift-image-registry/node-ca-tns7g" Apr 16 04:24:06.597875 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597792 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-sys\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.597875 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597819 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/70cfcc27-c214-4fec-a12c-618606910cbf-device-dir\") pod \"aws-ebs-csi-driver-node-cmsfr\" (UID: \"70cfcc27-c214-4fec-a12c-618606910cbf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" Apr 16 04:24:06.598638 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597864 2567 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0423948b-a820-4490-bb94-3810fdef3c06-iptables-alerter-script\") pod \"iptables-alerter-xgrdb\" (UID: \"0423948b-a820-4490-bb94-3810fdef3c06\") " pod="openshift-network-operator/iptables-alerter-xgrdb" Apr 16 04:24:06.598638 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597888 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/84db4e3c-a849-4173-b21b-fbb75fd25be3-system-cni-dir\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz" Apr 16 04:24:06.598638 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597913 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/84db4e3c-a849-4173-b21b-fbb75fd25be3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz" Apr 16 04:24:06.598638 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597937 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-os-release\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.598638 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597960 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjnlp\" (UniqueName: \"kubernetes.io/projected/c70b5e71-9ba3-4891-851c-653635c97ffb-kube-api-access-tjnlp\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.598638 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.597986 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-etc-sysconfig\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.598638 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598010 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lg2vk\" (UniqueName: \"kubernetes.io/projected/c35154e3-77e2-4f41-b5e8-99905ca385f9-kube-api-access-lg2vk\") pod \"node-resolver-7pjb8\" (UID: \"c35154e3-77e2-4f41-b5e8-99905ca385f9\") " pod="openshift-dns/node-resolver-7pjb8" Apr 16 04:24:06.598638 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598053 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c70b5e71-9ba3-4891-851c-653635c97ffb-cni-binary-copy\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.598638 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598078 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-host\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.598638 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598114 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnxtc\" (UniqueName: \"kubernetes.io/projected/0423948b-a820-4490-bb94-3810fdef3c06-kube-api-access-fnxtc\") pod \"iptables-alerter-xgrdb\" (UID: \"0423948b-a820-4490-bb94-3810fdef3c06\") " pod="openshift-network-operator/iptables-alerter-xgrdb" Apr 16 04:24:06.598638 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598142 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/84db4e3c-a849-4173-b21b-fbb75fd25be3-cnibin\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz" Apr 16 04:24:06.598638 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598177 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54h82\" (UniqueName: \"kubernetes.io/projected/84db4e3c-a849-4173-b21b-fbb75fd25be3-kube-api-access-54h82\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz" Apr 16 04:24:06.598638 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598210 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-multus-socket-dir-parent\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.598638 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598245 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/84db4e3c-a849-4173-b21b-fbb75fd25be3-cni-binary-copy\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz" Apr 16 04:24:06.598638 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598258 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-host-kubelet\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 
04:24:06.598638 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598288 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f41140f0-dc31-4907-aae1-f8d108bb517f-serviceca\") pod \"node-ca-tns7g\" (UID: \"f41140f0-dc31-4907-aae1-f8d108bb517f\") " pod="openshift-image-registry/node-ca-tns7g" Apr 16 04:24:06.598638 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598377 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-run-ovn\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.599417 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598411 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-host-run-ovn-kubernetes\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.599417 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:06.598422 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:06.599417 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598463 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-host-cni-netd\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.599417 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598502 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/fbe3cd6d-315c-4d44-81a3-217be3d98348-env-overrides\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.599417 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:06.598520 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs podName:638d6e19-46c9-4d63-a7b2-461e842da022 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:07.098485532 +0000 UTC m=+3.083935593 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs") pod "network-metrics-daemon-j6hlh" (UID: "638d6e19-46c9-4d63-a7b2-461e842da022") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:06.599417 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598552 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-etc-systemd\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.599417 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598581 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/70cfcc27-c214-4fec-a12c-618606910cbf-sys-fs\") pod \"aws-ebs-csi-driver-node-cmsfr\" (UID: \"70cfcc27-c214-4fec-a12c-618606910cbf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" Apr 16 04:24:06.599417 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598611 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-hostroot\") 
pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.599417 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598642 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c70b5e71-9ba3-4891-851c-653635c97ffb-multus-daemon-config\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.599417 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598669 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-host-run-multus-certs\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.599417 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598697 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-host-slash\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.599417 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598724 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-host-run-netns\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.599417 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598737 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f41140f0-dc31-4907-aae1-f8d108bb517f-serviceca\") pod \"node-ca-tns7g\" (UID: 
\"f41140f0-dc31-4907-aae1-f8d108bb517f\") " pod="openshift-image-registry/node-ca-tns7g" Apr 16 04:24:06.599417 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598753 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/70cfcc27-c214-4fec-a12c-618606910cbf-etc-selinux\") pod \"aws-ebs-csi-driver-node-cmsfr\" (UID: \"70cfcc27-c214-4fec-a12c-618606910cbf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" Apr 16 04:24:06.599417 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598843 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-dbus\") pod \"global-pull-secret-syncer-28th9\" (UID: \"dcdc9bd2-6cdd-48d1-850f-80adbc878d5f\") " pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:06.599417 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598903 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-run-openvswitch\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.599417 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598941 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-log-socket\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.600184 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598968 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.600184 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.598996 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-run\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.600184 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599022 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w752p\" (UniqueName: \"kubernetes.io/projected/6b433758-a3ea-42f0-997e-16cc16207047-kube-api-access-w752p\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.600184 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599049 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/70cfcc27-c214-4fec-a12c-618606910cbf-registration-dir\") pod \"aws-ebs-csi-driver-node-cmsfr\" (UID: \"70cfcc27-c214-4fec-a12c-618606910cbf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" Apr 16 04:24:06.600184 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599089 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-systemd-units\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.600184 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599125 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-lib-modules\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.600184 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599154 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-system-cni-dir\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.600184 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599179 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-host-var-lib-kubelet\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.600184 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599204 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-host-cni-bin\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.600184 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599240 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-etc-sysctl-d\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.600184 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599283 2567 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/84db4e3c-a849-4173-b21b-fbb75fd25be3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz" Apr 16 04:24:06.600184 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599345 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.600184 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599433 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-run\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.600184 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599565 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c35154e3-77e2-4f41-b5e8-99905ca385f9-tmp-dir\") pod \"node-resolver-7pjb8\" (UID: \"c35154e3-77e2-4f41-b5e8-99905ca385f9\") " pod="openshift-dns/node-resolver-7pjb8" Apr 16 04:24:06.600184 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599609 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fbe3cd6d-315c-4d44-81a3-217be3d98348-env-overrides\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.600184 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599659 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-etc-kubernetes\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.600184 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599691 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-run-ovn\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.600972 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599683 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-systemd-units\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.600972 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599705 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-run-openvswitch\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.600972 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599746 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-host-run-ovn-kubernetes\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.600972 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599759 2567 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-host-cni-netd\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.600972 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599787 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-lib-modules\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.600972 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599799 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f41140f0-dc31-4907-aae1-f8d108bb517f-host\") pod \"node-ca-tns7g\" (UID: \"f41140f0-dc31-4907-aae1-f8d108bb517f\") " pod="openshift-image-registry/node-ca-tns7g" Apr 16 04:24:06.600972 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599853 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-system-cni-dir\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.600972 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599819 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-var-lib-openvswitch\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.600972 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599895 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-host-var-lib-kubelet\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.600972 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599932 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/84db4e3c-a849-4173-b21b-fbb75fd25be3-cnibin\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz" Apr 16 04:24:06.600972 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599934 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-host-cni-bin\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.600972 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.600005 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-sys\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.600972 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.600011 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-etc-sysctl-d\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.600972 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.600072 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/84db4e3c-a849-4173-b21b-fbb75fd25be3-system-cni-dir\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz" Apr 16 04:24:06.600972 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.600100 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-multus-socket-dir-parent\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.600972 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.600144 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-host-kubelet\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.600972 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.600256 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-etc-systemd\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.600972 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.600304 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-hostroot\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.601760 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.600553 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/0423948b-a820-4490-bb94-3810fdef3c06-iptables-alerter-script\") pod \"iptables-alerter-xgrdb\" (UID: \"0423948b-a820-4490-bb94-3810fdef3c06\") " pod="openshift-network-operator/iptables-alerter-xgrdb" Apr 16 04:24:06.601760 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.600852 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-os-release\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.601760 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.600922 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-etc-sysconfig\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.601760 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601144 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6b433758-a3ea-42f0-997e-16cc16207047-etc-tuned\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.601760 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601189 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b433758-a3ea-42f0-997e-16cc16207047-tmp\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.601760 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601223 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-host-run-netns\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.601760 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601247 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/84db4e3c-a849-4173-b21b-fbb75fd25be3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz" Apr 16 04:24:06.601760 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601268 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-host-run-multus-certs\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.601760 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601286 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-host\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.601760 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601327 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-host-slash\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.601760 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601374 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-log-socket\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.601760 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.599285 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c35154e3-77e2-4f41-b5e8-99905ca385f9-tmp-dir\") pod \"node-resolver-7pjb8\" (UID: \"c35154e3-77e2-4f41-b5e8-99905ca385f9\") " pod="openshift-dns/node-resolver-7pjb8" Apr 16 04:24:06.601760 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601397 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c70b5e71-9ba3-4891-851c-653635c97ffb-multus-daemon-config\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.601760 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601436 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/84db4e3c-a849-4173-b21b-fbb75fd25be3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz" Apr 16 04:24:06.601760 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601466 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-cnibin\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.601760 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601517 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-cnibin\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.601760 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601581 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-host-run-netns\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.601760 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601633 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-host-var-lib-cni-bin\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.602498 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601694 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-host-run-netns\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.602498 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601788 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-multus-conf-dir\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.602498 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601804 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c70b5e71-9ba3-4891-851c-653635c97ffb-cni-binary-copy\") pod 
\"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.602498 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601819 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6b8xf\" (UniqueName: \"kubernetes.io/projected/638d6e19-46c9-4d63-a7b2-461e842da022-kube-api-access-6b8xf\") pod \"network-metrics-daemon-j6hlh\" (UID: \"638d6e19-46c9-4d63-a7b2-461e842da022\") " pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:24:06.602498 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601860 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-host-var-lib-cni-bin\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.602498 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601885 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-multus-conf-dir\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.602498 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601888 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-kubelet-config\") pod \"global-pull-secret-syncer-28th9\" (UID: \"dcdc9bd2-6cdd-48d1-850f-80adbc878d5f\") " pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:06.602498 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601919 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-etc-openvswitch\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.602498 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601949 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fbe3cd6d-315c-4d44-81a3-217be3d98348-ovnkube-config\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.602498 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.601974 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-var-lib-kubelet\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.602498 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.602000 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-etc-openvswitch\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.602498 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.602001 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret\") pod \"global-pull-secret-syncer-28th9\" (UID: \"dcdc9bd2-6cdd-48d1-850f-80adbc878d5f\") " pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:06.602498 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.602052 2567 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-multus-cni-dir\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.602498 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.602056 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/84db4e3c-a849-4173-b21b-fbb75fd25be3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz" Apr 16 04:24:06.602498 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.602132 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-run-systemd\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.602498 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.602163 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c70b5e71-9ba3-4891-851c-653635c97ffb-multus-cni-dir\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.602498 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.602187 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fbe3cd6d-315c-4d44-81a3-217be3d98348-run-systemd\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.602975 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.602192 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b433758-a3ea-42f0-997e-16cc16207047-var-lib-kubelet\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.602975 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.602201 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/70cfcc27-c214-4fec-a12c-618606910cbf-socket-dir\") pod \"aws-ebs-csi-driver-node-cmsfr\" (UID: \"70cfcc27-c214-4fec-a12c-618606910cbf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" Apr 16 04:24:06.602975 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.602597 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fbe3cd6d-315c-4d44-81a3-217be3d98348-ovnkube-config\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.602975 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.602919 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/98bbb6ac-6605-4f70-9681-333eea1951c2-agent-certs\") pod \"konnectivity-agent-x8979\" (UID: \"98bbb6ac-6605-4f70-9681-333eea1951c2\") " pod="kube-system/konnectivity-agent-x8979" Apr 16 04:24:06.603983 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.603952 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fbe3cd6d-315c-4d44-81a3-217be3d98348-ovn-node-metrics-cert\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.606505 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:06.606482 2567 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 04:24:06.606505 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:06.606508 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 04:24:06.606618 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:06.606521 2567 projected.go:194] Error preparing data for projected volume kube-api-access-w2kbm for pod openshift-network-diagnostics/network-check-target-b2zb5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:06.606618 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:06.606595 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm podName:c1c49d2a-98d1-4d28-9e17-3967b6431a92 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:07.106573689 +0000 UTC m=+3.092023745 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-w2kbm" (UniqueName: "kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm") pod "network-check-target-b2zb5" (UID: "c1c49d2a-98d1-4d28-9e17-3967b6431a92") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:06.610778 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.610659 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxbj2\" (UniqueName: \"kubernetes.io/projected/f41140f0-dc31-4907-aae1-f8d108bb517f-kube-api-access-cxbj2\") pod \"node-ca-tns7g\" (UID: \"f41140f0-dc31-4907-aae1-f8d108bb517f\") " pod="openshift-image-registry/node-ca-tns7g" Apr 16 04:24:06.610778 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.610676 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg2vk\" (UniqueName: \"kubernetes.io/projected/c35154e3-77e2-4f41-b5e8-99905ca385f9-kube-api-access-lg2vk\") pod \"node-resolver-7pjb8\" (UID: \"c35154e3-77e2-4f41-b5e8-99905ca385f9\") " pod="openshift-dns/node-resolver-7pjb8" Apr 16 04:24:06.610778 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.610751 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnxtc\" (UniqueName: \"kubernetes.io/projected/0423948b-a820-4490-bb94-3810fdef3c06-kube-api-access-fnxtc\") pod \"iptables-alerter-xgrdb\" (UID: \"0423948b-a820-4490-bb94-3810fdef3c06\") " pod="openshift-network-operator/iptables-alerter-xgrdb" Apr 16 04:24:06.610778 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.610767 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w752p\" (UniqueName: \"kubernetes.io/projected/6b433758-a3ea-42f0-997e-16cc16207047-kube-api-access-w752p\") pod \"tuned-qvg6g\" (UID: \"6b433758-a3ea-42f0-997e-16cc16207047\") " 
pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.611212 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.611195 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjnlp\" (UniqueName: \"kubernetes.io/projected/c70b5e71-9ba3-4891-851c-653635c97ffb-kube-api-access-tjnlp\") pod \"multus-n4qkb\" (UID: \"c70b5e71-9ba3-4891-851c-653635c97ffb\") " pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.611263 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.611198 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54h82\" (UniqueName: \"kubernetes.io/projected/84db4e3c-a849-4173-b21b-fbb75fd25be3-kube-api-access-54h82\") pod \"multus-additional-cni-plugins-6wjzz\" (UID: \"84db4e3c-a849-4173-b21b-fbb75fd25be3\") " pod="openshift-multus/multus-additional-cni-plugins-6wjzz" Apr 16 04:24:06.611521 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.611496 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b8xf\" (UniqueName: \"kubernetes.io/projected/638d6e19-46c9-4d63-a7b2-461e842da022-kube-api-access-6b8xf\") pod \"network-metrics-daemon-j6hlh\" (UID: \"638d6e19-46c9-4d63-a7b2-461e842da022\") " pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:24:06.612647 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.612618 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlx8d\" (UniqueName: \"kubernetes.io/projected/fbe3cd6d-315c-4d44-81a3-217be3d98348-kube-api-access-zlx8d\") pod \"ovnkube-node-s79zz\" (UID: \"fbe3cd6d-315c-4d44-81a3-217be3d98348\") " pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.675477 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.675411 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-81.ec2.internal" 
event={"ID":"558b1f9822c72f9ef387e865da1b3b63","Type":"ContainerStarted","Data":"54da9ed2cfcdbe47c14172d857c11974dda7439356208bc69c6bbf65548721be"} Apr 16 04:24:06.677704 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.677665 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-81.ec2.internal" event={"ID":"4d62c969dcba3db9916e6f511688c3bd","Type":"ContainerStarted","Data":"0d609c9c14562eaa0473b434f852f4b1a84461fcf0132f3e2b80c955038b1919"} Apr 16 04:24:06.703406 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.703376 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/70cfcc27-c214-4fec-a12c-618606910cbf-sys-fs\") pod \"aws-ebs-csi-driver-node-cmsfr\" (UID: \"70cfcc27-c214-4fec-a12c-618606910cbf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" Apr 16 04:24:06.703588 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.703417 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/70cfcc27-c214-4fec-a12c-618606910cbf-etc-selinux\") pod \"aws-ebs-csi-driver-node-cmsfr\" (UID: \"70cfcc27-c214-4fec-a12c-618606910cbf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" Apr 16 04:24:06.703588 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.703442 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-dbus\") pod \"global-pull-secret-syncer-28th9\" (UID: \"dcdc9bd2-6cdd-48d1-850f-80adbc878d5f\") " pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:06.703588 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.703471 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/70cfcc27-c214-4fec-a12c-618606910cbf-registration-dir\") pod \"aws-ebs-csi-driver-node-cmsfr\" (UID: \"70cfcc27-c214-4fec-a12c-618606910cbf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" Apr 16 04:24:06.703588 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.703507 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/70cfcc27-c214-4fec-a12c-618606910cbf-sys-fs\") pod \"aws-ebs-csi-driver-node-cmsfr\" (UID: \"70cfcc27-c214-4fec-a12c-618606910cbf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" Apr 16 04:24:06.703588 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.703515 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/70cfcc27-c214-4fec-a12c-618606910cbf-etc-selinux\") pod \"aws-ebs-csi-driver-node-cmsfr\" (UID: \"70cfcc27-c214-4fec-a12c-618606910cbf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" Apr 16 04:24:06.703588 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.703524 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-kubelet-config\") pod \"global-pull-secret-syncer-28th9\" (UID: \"dcdc9bd2-6cdd-48d1-850f-80adbc878d5f\") " pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:06.703905 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.703622 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret\") pod \"global-pull-secret-syncer-28th9\" (UID: \"dcdc9bd2-6cdd-48d1-850f-80adbc878d5f\") " pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:06.703905 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.703583 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-kubelet-config\") pod \"global-pull-secret-syncer-28th9\" (UID: \"dcdc9bd2-6cdd-48d1-850f-80adbc878d5f\") " pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:06.703905 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.703647 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-dbus\") pod \"global-pull-secret-syncer-28th9\" (UID: \"dcdc9bd2-6cdd-48d1-850f-80adbc878d5f\") " pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:06.703905 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.703585 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/70cfcc27-c214-4fec-a12c-618606910cbf-registration-dir\") pod \"aws-ebs-csi-driver-node-cmsfr\" (UID: \"70cfcc27-c214-4fec-a12c-618606910cbf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" Apr 16 04:24:06.703905 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.703676 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/70cfcc27-c214-4fec-a12c-618606910cbf-socket-dir\") pod \"aws-ebs-csi-driver-node-cmsfr\" (UID: \"70cfcc27-c214-4fec-a12c-618606910cbf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" Apr 16 04:24:06.703905 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.703701 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mptj\" (UniqueName: \"kubernetes.io/projected/70cfcc27-c214-4fec-a12c-618606910cbf-kube-api-access-2mptj\") pod \"aws-ebs-csi-driver-node-cmsfr\" (UID: \"70cfcc27-c214-4fec-a12c-618606910cbf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" 
Apr 16 04:24:06.703905 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:06.703728 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 04:24:06.703905 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.703759 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70cfcc27-c214-4fec-a12c-618606910cbf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cmsfr\" (UID: \"70cfcc27-c214-4fec-a12c-618606910cbf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" Apr 16 04:24:06.703905 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:06.703791 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret podName:dcdc9bd2-6cdd-48d1-850f-80adbc878d5f nodeName:}" failed. No retries permitted until 2026-04-16 04:24:07.203769549 +0000 UTC m=+3.189219596 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret") pod "global-pull-secret-syncer-28th9" (UID: "dcdc9bd2-6cdd-48d1-850f-80adbc878d5f") : object "kube-system"/"original-pull-secret" not registered Apr 16 04:24:06.703905 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.703876 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/70cfcc27-c214-4fec-a12c-618606910cbf-socket-dir\") pod \"aws-ebs-csi-driver-node-cmsfr\" (UID: \"70cfcc27-c214-4fec-a12c-618606910cbf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" Apr 16 04:24:06.703905 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.703887 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/70cfcc27-c214-4fec-a12c-618606910cbf-device-dir\") pod \"aws-ebs-csi-driver-node-cmsfr\" (UID: \"70cfcc27-c214-4fec-a12c-618606910cbf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" Apr 16 04:24:06.704353 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.703958 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/70cfcc27-c214-4fec-a12c-618606910cbf-device-dir\") pod \"aws-ebs-csi-driver-node-cmsfr\" (UID: \"70cfcc27-c214-4fec-a12c-618606910cbf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" Apr 16 04:24:06.704353 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.703959 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70cfcc27-c214-4fec-a12c-618606910cbf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cmsfr\" (UID: \"70cfcc27-c214-4fec-a12c-618606910cbf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" Apr 16 04:24:06.712569 
ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.712542 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mptj\" (UniqueName: \"kubernetes.io/projected/70cfcc27-c214-4fec-a12c-618606910cbf-kube-api-access-2mptj\") pod \"aws-ebs-csi-driver-node-cmsfr\" (UID: \"70cfcc27-c214-4fec-a12c-618606910cbf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" Apr 16 04:24:06.782488 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.782398 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xgrdb" Apr 16 04:24:06.790562 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.790350 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7pjb8" Apr 16 04:24:06.804531 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.804499 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tns7g" Apr 16 04:24:06.810170 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.810144 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6wjzz" Apr 16 04:24:06.816924 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.816904 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-n4qkb" Apr 16 04:24:06.824651 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.824633 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:24:06.831371 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.831349 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-x8979" Apr 16 04:24:06.840025 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.840001 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" Apr 16 04:24:06.846789 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:06.846756 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" Apr 16 04:24:07.106939 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:07.106814 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2kbm\" (UniqueName: \"kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm\") pod \"network-check-target-b2zb5\" (UID: \"c1c49d2a-98d1-4d28-9e17-3967b6431a92\") " pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:24:07.106939 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:07.106908 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs\") pod \"network-metrics-daemon-j6hlh\" (UID: \"638d6e19-46c9-4d63-a7b2-461e842da022\") " pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:24:07.107148 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:07.107001 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 04:24:07.107148 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:07.107023 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 04:24:07.107148 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:07.107031 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:07.107148 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:07.107035 2567 projected.go:194] Error 
preparing data for projected volume kube-api-access-w2kbm for pod openshift-network-diagnostics/network-check-target-b2zb5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:07.107148 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:07.107094 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs podName:638d6e19-46c9-4d63-a7b2-461e842da022 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:08.107075321 +0000 UTC m=+4.092525374 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs") pod "network-metrics-daemon-j6hlh" (UID: "638d6e19-46c9-4d63-a7b2-461e842da022") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:07.107148 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:07.107113 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm podName:c1c49d2a-98d1-4d28-9e17-3967b6431a92 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:08.107104543 +0000 UTC m=+4.092554588 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-w2kbm" (UniqueName: "kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm") pod "network-check-target-b2zb5" (UID: "c1c49d2a-98d1-4d28-9e17-3967b6431a92") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:07.207561 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:07.207523 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret\") pod \"global-pull-secret-syncer-28th9\" (UID: \"dcdc9bd2-6cdd-48d1-850f-80adbc878d5f\") " pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:07.207728 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:07.207681 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 04:24:07.207781 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:07.207754 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret podName:dcdc9bd2-6cdd-48d1-850f-80adbc878d5f nodeName:}" failed. No retries permitted until 2026-04-16 04:24:08.207734683 +0000 UTC m=+4.193184741 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret") pod "global-pull-secret-syncer-28th9" (UID: "dcdc9bd2-6cdd-48d1-850f-80adbc878d5f") : object "kube-system"/"original-pull-secret" not registered Apr 16 04:24:07.422361 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:07.422332 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98bbb6ac_6605_4f70_9681_333eea1951c2.slice/crio-573ea970aebc859950d508b23325335100cf2ad899893a940bead5756d28d17d WatchSource:0}: Error finding container 573ea970aebc859950d508b23325335100cf2ad899893a940bead5756d28d17d: Status 404 returned error can't find the container with id 573ea970aebc859950d508b23325335100cf2ad899893a940bead5756d28d17d Apr 16 04:24:07.439460 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:07.439412 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf41140f0_dc31_4907_aae1_f8d108bb517f.slice/crio-e1c44a14095f758a64078f43f8e088b91acd2217c28502a40b8e58579bf5a303 WatchSource:0}: Error finding container e1c44a14095f758a64078f43f8e088b91acd2217c28502a40b8e58579bf5a303: Status 404 returned error can't find the container with id e1c44a14095f758a64078f43f8e088b91acd2217c28502a40b8e58579bf5a303 Apr 16 04:24:07.440036 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:07.440007 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc70b5e71_9ba3_4891_851c_653635c97ffb.slice/crio-825cf9a1a0796bb8c456a1f18a522d3cace7266cacca89700002d5fefb635676 WatchSource:0}: Error finding container 825cf9a1a0796bb8c456a1f18a522d3cace7266cacca89700002d5fefb635676: Status 404 returned error can't find the container with id 825cf9a1a0796bb8c456a1f18a522d3cace7266cacca89700002d5fefb635676 Apr 16 04:24:07.442479 
ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:07.442323 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc35154e3_77e2_4f41_b5e8_99905ca385f9.slice/crio-1504cf003fe0e3951032e308675a356c2edcac17bd274ff04192badd26193438 WatchSource:0}: Error finding container 1504cf003fe0e3951032e308675a356c2edcac17bd274ff04192badd26193438: Status 404 returned error can't find the container with id 1504cf003fe0e3951032e308675a356c2edcac17bd274ff04192badd26193438 Apr 16 04:24:07.443264 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:07.443242 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84db4e3c_a849_4173_b21b_fbb75fd25be3.slice/crio-e04ca07ffc69e7bef04692a1e67550c775344c523e6782e2e25ebbff39870d0d WatchSource:0}: Error finding container e04ca07ffc69e7bef04692a1e67550c775344c523e6782e2e25ebbff39870d0d: Status 404 returned error can't find the container with id e04ca07ffc69e7bef04692a1e67550c775344c523e6782e2e25ebbff39870d0d Apr 16 04:24:07.445418 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:07.444810 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0423948b_a820_4490_bb94_3810fdef3c06.slice/crio-d2660bf1d5214ca0e9c9576f4d0169b9022fcfb7b247ccb6a74901357867fe73 WatchSource:0}: Error finding container d2660bf1d5214ca0e9c9576f4d0169b9022fcfb7b247ccb6a74901357867fe73: Status 404 returned error can't find the container with id d2660bf1d5214ca0e9c9576f4d0169b9022fcfb7b247ccb6a74901357867fe73 Apr 16 04:24:07.445702 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:07.445455 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbe3cd6d_315c_4d44_81a3_217be3d98348.slice/crio-5fb17e4a5682d59f38f8afa258d5ef59c96f4feabe0429fdbeccd4ec17175af6 WatchSource:0}: Error 
finding container 5fb17e4a5682d59f38f8afa258d5ef59c96f4feabe0429fdbeccd4ec17175af6: Status 404 returned error can't find the container with id 5fb17e4a5682d59f38f8afa258d5ef59c96f4feabe0429fdbeccd4ec17175af6 Apr 16 04:24:07.446689 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:07.446661 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70cfcc27_c214_4fec_a12c_618606910cbf.slice/crio-c1b95ee4700f8acf4ef1bfe043a44740526550907c8e746d628e297741ce437f WatchSource:0}: Error finding container c1b95ee4700f8acf4ef1bfe043a44740526550907c8e746d628e297741ce437f: Status 404 returned error can't find the container with id c1b95ee4700f8acf4ef1bfe043a44740526550907c8e746d628e297741ce437f Apr 16 04:24:07.447912 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:24:07.447884 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b433758_a3ea_42f0_997e_16cc16207047.slice/crio-b29eb4a8dc1e87799265e270d0b00c72223bfc09af394c5e8dd32709b7f095bd WatchSource:0}: Error finding container b29eb4a8dc1e87799265e270d0b00c72223bfc09af394c5e8dd32709b7f095bd: Status 404 returned error can't find the container with id b29eb4a8dc1e87799265e270d0b00c72223bfc09af394c5e8dd32709b7f095bd Apr 16 04:24:07.522595 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:07.522561 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 04:19:05 +0000 UTC" deadline="2027-11-03 19:27:01.464448955 +0000 UTC" Apr 16 04:24:07.522595 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:07.522592 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13599h2m53.941859714s" Apr 16 04:24:07.669896 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:07.669762 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:24:07.670061 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:07.669904 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2zb5" podUID="c1c49d2a-98d1-4d28-9e17-3967b6431a92" Apr 16 04:24:07.680691 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:07.680663 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wjzz" event={"ID":"84db4e3c-a849-4173-b21b-fbb75fd25be3","Type":"ContainerStarted","Data":"e04ca07ffc69e7bef04692a1e67550c775344c523e6782e2e25ebbff39870d0d"} Apr 16 04:24:07.681558 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:07.681524 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tns7g" event={"ID":"f41140f0-dc31-4907-aae1-f8d108bb517f","Type":"ContainerStarted","Data":"e1c44a14095f758a64078f43f8e088b91acd2217c28502a40b8e58579bf5a303"} Apr 16 04:24:07.682586 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:07.682549 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-x8979" event={"ID":"98bbb6ac-6605-4f70-9681-333eea1951c2","Type":"ContainerStarted","Data":"573ea970aebc859950d508b23325335100cf2ad899893a940bead5756d28d17d"} Apr 16 04:24:07.684165 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:07.684136 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-81.ec2.internal" event={"ID":"558b1f9822c72f9ef387e865da1b3b63","Type":"ContainerStarted","Data":"39ba7ea9ec15ce35344510af86b7749366e5e2d47a66b5320238e52bbab391d0"} Apr 16 04:24:07.685292 ip-10-0-133-81 kubenswrapper[2567]: 
I0416 04:24:07.685271 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" event={"ID":"6b433758-a3ea-42f0-997e-16cc16207047","Type":"ContainerStarted","Data":"b29eb4a8dc1e87799265e270d0b00c72223bfc09af394c5e8dd32709b7f095bd"} Apr 16 04:24:07.688393 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:07.688359 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n4qkb" event={"ID":"c70b5e71-9ba3-4891-851c-653635c97ffb","Type":"ContainerStarted","Data":"825cf9a1a0796bb8c456a1f18a522d3cace7266cacca89700002d5fefb635676"} Apr 16 04:24:07.689809 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:07.689785 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" event={"ID":"fbe3cd6d-315c-4d44-81a3-217be3d98348","Type":"ContainerStarted","Data":"5fb17e4a5682d59f38f8afa258d5ef59c96f4feabe0429fdbeccd4ec17175af6"} Apr 16 04:24:07.690732 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:07.690707 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7pjb8" event={"ID":"c35154e3-77e2-4f41-b5e8-99905ca385f9","Type":"ContainerStarted","Data":"1504cf003fe0e3951032e308675a356c2edcac17bd274ff04192badd26193438"} Apr 16 04:24:07.691712 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:07.691694 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" event={"ID":"70cfcc27-c214-4fec-a12c-618606910cbf","Type":"ContainerStarted","Data":"c1b95ee4700f8acf4ef1bfe043a44740526550907c8e746d628e297741ce437f"} Apr 16 04:24:07.692632 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:07.692613 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xgrdb" event={"ID":"0423948b-a820-4490-bb94-3810fdef3c06","Type":"ContainerStarted","Data":"d2660bf1d5214ca0e9c9576f4d0169b9022fcfb7b247ccb6a74901357867fe73"} Apr 16 04:24:07.699446 
ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:07.699405 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-81.ec2.internal" podStartSLOduration=2.699391539 podStartE2EDuration="2.699391539s" podCreationTimestamp="2026-04-16 04:24:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 04:24:07.698887753 +0000 UTC m=+3.684337814" watchObservedRunningTime="2026-04-16 04:24:07.699391539 +0000 UTC m=+3.684841593" Apr 16 04:24:08.114419 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:08.114331 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2kbm\" (UniqueName: \"kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm\") pod \"network-check-target-b2zb5\" (UID: \"c1c49d2a-98d1-4d28-9e17-3967b6431a92\") " pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:24:08.114419 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:08.114392 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs\") pod \"network-metrics-daemon-j6hlh\" (UID: \"638d6e19-46c9-4d63-a7b2-461e842da022\") " pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:24:08.114638 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:08.114594 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:08.114689 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:08.114661 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs podName:638d6e19-46c9-4d63-a7b2-461e842da022 nodeName:}" failed. 
No retries permitted until 2026-04-16 04:24:10.114642285 +0000 UTC m=+6.100092350 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs") pod "network-metrics-daemon-j6hlh" (UID: "638d6e19-46c9-4d63-a7b2-461e842da022") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:08.115142 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:08.115121 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 04:24:08.115249 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:08.115149 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 04:24:08.115249 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:08.115162 2567 projected.go:194] Error preparing data for projected volume kube-api-access-w2kbm for pod openshift-network-diagnostics/network-check-target-b2zb5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:08.115249 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:08.115205 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm podName:c1c49d2a-98d1-4d28-9e17-3967b6431a92 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:10.115191319 +0000 UTC m=+6.100641378 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-w2kbm" (UniqueName: "kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm") pod "network-check-target-b2zb5" (UID: "c1c49d2a-98d1-4d28-9e17-3967b6431a92") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:08.215339 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:08.215295 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret\") pod \"global-pull-secret-syncer-28th9\" (UID: \"dcdc9bd2-6cdd-48d1-850f-80adbc878d5f\") " pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:08.215529 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:08.215512 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 04:24:08.215593 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:08.215581 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret podName:dcdc9bd2-6cdd-48d1-850f-80adbc878d5f nodeName:}" failed. No retries permitted until 2026-04-16 04:24:10.215562984 +0000 UTC m=+6.201013028 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret") pod "global-pull-secret-syncer-28th9" (UID: "dcdc9bd2-6cdd-48d1-850f-80adbc878d5f") : object "kube-system"/"original-pull-secret" not registered Apr 16 04:24:08.670553 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:08.669977 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:24:08.670553 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:08.670119 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6hlh" podUID="638d6e19-46c9-4d63-a7b2-461e842da022" Apr 16 04:24:08.673793 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:08.671236 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:08.673793 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:08.671337 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-28th9" podUID="dcdc9bd2-6cdd-48d1-850f-80adbc878d5f" Apr 16 04:24:08.709604 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:08.709566 2567 generic.go:358] "Generic (PLEG): container finished" podID="4d62c969dcba3db9916e6f511688c3bd" containerID="3b92c3643ea96921dec104f78c8c2cdbf5b65f56974978c3603dfbff81ff6a8e" exitCode=0 Apr 16 04:24:08.709775 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:08.709679 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-81.ec2.internal" event={"ID":"4d62c969dcba3db9916e6f511688c3bd","Type":"ContainerDied","Data":"3b92c3643ea96921dec104f78c8c2cdbf5b65f56974978c3603dfbff81ff6a8e"} Apr 16 04:24:09.669908 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:09.669872 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:24:09.670149 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:09.670010 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b2zb5" podUID="c1c49d2a-98d1-4d28-9e17-3967b6431a92" Apr 16 04:24:09.725621 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:09.725584 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-81.ec2.internal" event={"ID":"4d62c969dcba3db9916e6f511688c3bd","Type":"ContainerStarted","Data":"edc8b7a28acb1c41546ec7d46b6caba5ecca835c5125294faff7bf5bd585bb0d"} Apr 16 04:24:10.132567 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:10.132479 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs\") pod \"network-metrics-daemon-j6hlh\" (UID: \"638d6e19-46c9-4d63-a7b2-461e842da022\") " pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:24:10.132743 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:10.132606 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2kbm\" (UniqueName: \"kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm\") pod \"network-check-target-b2zb5\" (UID: \"c1c49d2a-98d1-4d28-9e17-3967b6431a92\") " pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:24:10.132819 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:10.132755 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 04:24:10.132819 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:10.132772 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 04:24:10.132819 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:10.132785 2567 projected.go:194] Error preparing data for projected volume 
kube-api-access-w2kbm for pod openshift-network-diagnostics/network-check-target-b2zb5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:10.133058 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:10.132862 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm podName:c1c49d2a-98d1-4d28-9e17-3967b6431a92 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:14.132841429 +0000 UTC m=+10.118291486 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-w2kbm" (UniqueName: "kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm") pod "network-check-target-b2zb5" (UID: "c1c49d2a-98d1-4d28-9e17-3967b6431a92") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:10.133305 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:10.133284 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:10.133378 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:10.133343 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs podName:638d6e19-46c9-4d63-a7b2-461e842da022 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:14.13332687 +0000 UTC m=+10.118776916 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs") pod "network-metrics-daemon-j6hlh" (UID: "638d6e19-46c9-4d63-a7b2-461e842da022") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:10.233523 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:10.233484 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret\") pod \"global-pull-secret-syncer-28th9\" (UID: \"dcdc9bd2-6cdd-48d1-850f-80adbc878d5f\") " pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:10.233716 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:10.233624 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 04:24:10.233716 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:10.233697 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret podName:dcdc9bd2-6cdd-48d1-850f-80adbc878d5f nodeName:}" failed. No retries permitted until 2026-04-16 04:24:14.233676436 +0000 UTC m=+10.219126482 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret") pod "global-pull-secret-syncer-28th9" (UID: "dcdc9bd2-6cdd-48d1-850f-80adbc878d5f") : object "kube-system"/"original-pull-secret" not registered Apr 16 04:24:10.673625 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:10.673593 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:10.673795 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:10.673723 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28th9" podUID="dcdc9bd2-6cdd-48d1-850f-80adbc878d5f" Apr 16 04:24:10.674168 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:10.674147 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:24:10.674302 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:10.674272 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6hlh" podUID="638d6e19-46c9-4d63-a7b2-461e842da022" Apr 16 04:24:11.669619 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:11.669585 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:24:11.670173 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:11.669735 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b2zb5" podUID="c1c49d2a-98d1-4d28-9e17-3967b6431a92" Apr 16 04:24:12.669725 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:12.669685 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:24:12.670191 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:12.669849 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6hlh" podUID="638d6e19-46c9-4d63-a7b2-461e842da022" Apr 16 04:24:12.670460 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:12.670300 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:12.670460 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:12.670401 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28th9" podUID="dcdc9bd2-6cdd-48d1-850f-80adbc878d5f" Apr 16 04:24:13.669857 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:13.669491 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:24:13.669857 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:13.669625 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2zb5" podUID="c1c49d2a-98d1-4d28-9e17-3967b6431a92" Apr 16 04:24:14.169911 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:14.169129 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2kbm\" (UniqueName: \"kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm\") pod \"network-check-target-b2zb5\" (UID: \"c1c49d2a-98d1-4d28-9e17-3967b6431a92\") " pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:24:14.169911 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:14.169195 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs\") pod \"network-metrics-daemon-j6hlh\" (UID: \"638d6e19-46c9-4d63-a7b2-461e842da022\") " pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:24:14.169911 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:14.169340 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:14.169911 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:14.169406 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs podName:638d6e19-46c9-4d63-a7b2-461e842da022 nodeName:}" failed. 
No retries permitted until 2026-04-16 04:24:22.169384629 +0000 UTC m=+18.154834674 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs") pod "network-metrics-daemon-j6hlh" (UID: "638d6e19-46c9-4d63-a7b2-461e842da022") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:14.169911 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:14.169493 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 04:24:14.169911 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:14.169508 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 04:24:14.169911 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:14.169520 2567 projected.go:194] Error preparing data for projected volume kube-api-access-w2kbm for pod openshift-network-diagnostics/network-check-target-b2zb5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:14.169911 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:14.169559 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm podName:c1c49d2a-98d1-4d28-9e17-3967b6431a92 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:22.169547455 +0000 UTC m=+18.154997500 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-w2kbm" (UniqueName: "kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm") pod "network-check-target-b2zb5" (UID: "c1c49d2a-98d1-4d28-9e17-3967b6431a92") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:14.270068 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:14.269967 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret\") pod \"global-pull-secret-syncer-28th9\" (UID: \"dcdc9bd2-6cdd-48d1-850f-80adbc878d5f\") " pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:14.270248 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:14.270129 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 04:24:14.270248 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:14.270208 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret podName:dcdc9bd2-6cdd-48d1-850f-80adbc878d5f nodeName:}" failed. No retries permitted until 2026-04-16 04:24:22.270184977 +0000 UTC m=+18.255635020 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret") pod "global-pull-secret-syncer-28th9" (UID: "dcdc9bd2-6cdd-48d1-850f-80adbc878d5f") : object "kube-system"/"original-pull-secret" not registered Apr 16 04:24:14.673779 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:14.673751 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:14.674247 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:14.673885 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28th9" podUID="dcdc9bd2-6cdd-48d1-850f-80adbc878d5f" Apr 16 04:24:14.674318 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:14.674269 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:24:14.674433 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:14.674364 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6hlh" podUID="638d6e19-46c9-4d63-a7b2-461e842da022" Apr 16 04:24:15.669577 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:15.669542 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:24:15.669770 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:15.669662 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b2zb5" podUID="c1c49d2a-98d1-4d28-9e17-3967b6431a92" Apr 16 04:24:16.670189 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:16.670093 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:24:16.670189 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:16.670143 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:16.670669 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:16.670252 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6hlh" podUID="638d6e19-46c9-4d63-a7b2-461e842da022" Apr 16 04:24:16.670669 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:16.670385 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28th9" podUID="dcdc9bd2-6cdd-48d1-850f-80adbc878d5f" Apr 16 04:24:17.670079 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:17.670034 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:24:17.670256 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:17.670185 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2zb5" podUID="c1c49d2a-98d1-4d28-9e17-3967b6431a92" Apr 16 04:24:18.669845 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:18.669796 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:24:18.669845 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:18.669815 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:18.670091 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:18.669986 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6hlh" podUID="638d6e19-46c9-4d63-a7b2-461e842da022" Apr 16 04:24:18.670091 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:18.670081 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-28th9" podUID="dcdc9bd2-6cdd-48d1-850f-80adbc878d5f" Apr 16 04:24:19.669908 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:19.669872 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:24:19.670350 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:19.669983 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2zb5" podUID="c1c49d2a-98d1-4d28-9e17-3967b6431a92" Apr 16 04:24:20.670340 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:20.670298 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:24:20.670774 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:20.670316 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:20.670774 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:20.670438 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6hlh" podUID="638d6e19-46c9-4d63-a7b2-461e842da022" Apr 16 04:24:20.670774 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:20.670530 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28th9" podUID="dcdc9bd2-6cdd-48d1-850f-80adbc878d5f" Apr 16 04:24:21.670039 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:21.669995 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:24:21.670219 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:21.670139 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b2zb5" podUID="c1c49d2a-98d1-4d28-9e17-3967b6431a92" Apr 16 04:24:22.226355 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:22.226315 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2kbm\" (UniqueName: \"kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm\") pod \"network-check-target-b2zb5\" (UID: \"c1c49d2a-98d1-4d28-9e17-3967b6431a92\") " pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:24:22.226811 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:22.226368 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs\") pod \"network-metrics-daemon-j6hlh\" (UID: \"638d6e19-46c9-4d63-a7b2-461e842da022\") " pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:24:22.226811 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:22.226495 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:22.226811 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:22.226513 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 04:24:22.226811 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:22.226534 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 04:24:22.226811 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:22.226545 2567 projected.go:194] Error preparing data for projected volume kube-api-access-w2kbm for pod openshift-network-diagnostics/network-check-target-b2zb5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:22.226811 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:22.226599 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs podName:638d6e19-46c9-4d63-a7b2-461e842da022 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:38.226578168 +0000 UTC m=+34.212028218 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs") pod "network-metrics-daemon-j6hlh" (UID: "638d6e19-46c9-4d63-a7b2-461e842da022") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:22.226811 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:22.226620 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm podName:c1c49d2a-98d1-4d28-9e17-3967b6431a92 nodeName:}" failed. No retries permitted until 2026-04-16 04:24:38.226609384 +0000 UTC m=+34.212059427 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-w2kbm" (UniqueName: "kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm") pod "network-check-target-b2zb5" (UID: "c1c49d2a-98d1-4d28-9e17-3967b6431a92") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:22.326929 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:22.326892 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret\") pod \"global-pull-secret-syncer-28th9\" (UID: \"dcdc9bd2-6cdd-48d1-850f-80adbc878d5f\") " pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:22.327123 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:22.327052 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 04:24:22.327182 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:22.327139 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret podName:dcdc9bd2-6cdd-48d1-850f-80adbc878d5f nodeName:}" failed. No retries permitted until 2026-04-16 04:24:38.327119617 +0000 UTC m=+34.312569684 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret") pod "global-pull-secret-syncer-28th9" (UID: "dcdc9bd2-6cdd-48d1-850f-80adbc878d5f") : object "kube-system"/"original-pull-secret" not registered Apr 16 04:24:22.669842 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:22.669795 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:24:22.670025 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:22.669945 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6hlh" podUID="638d6e19-46c9-4d63-a7b2-461e842da022" Apr 16 04:24:22.670025 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:22.670007 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:22.670155 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:22.670131 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28th9" podUID="dcdc9bd2-6cdd-48d1-850f-80adbc878d5f" Apr 16 04:24:23.669333 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:23.669288 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:24:23.669799 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:23.669427 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b2zb5" podUID="c1c49d2a-98d1-4d28-9e17-3967b6431a92" Apr 16 04:24:24.671358 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.671025 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:24.672083 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.671084 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:24:24.672083 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:24.671477 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28th9" podUID="dcdc9bd2-6cdd-48d1-850f-80adbc878d5f" Apr 16 04:24:24.672083 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:24.671570 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6hlh" podUID="638d6e19-46c9-4d63-a7b2-461e842da022" Apr 16 04:24:24.751388 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.751347 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7pjb8" event={"ID":"c35154e3-77e2-4f41-b5e8-99905ca385f9","Type":"ContainerStarted","Data":"fc5751d45870edd85bf667a960c6db89abf4fbc4f014486d60436aa99775fa3e"} Apr 16 04:24:24.753019 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.752988 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" event={"ID":"70cfcc27-c214-4fec-a12c-618606910cbf","Type":"ContainerStarted","Data":"29278390a328c816e48257c2c2fbef179c4ae82a28903915c27c1879b571977e"} Apr 16 04:24:24.754613 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.754585 2567 generic.go:358] "Generic (PLEG): container finished" podID="84db4e3c-a849-4173-b21b-fbb75fd25be3" containerID="0c7ac2aa4d774349f6fe44a54858e647415555ad8dcb863d8d1412aa247616e2" exitCode=0 Apr 16 04:24:24.754728 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.754639 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wjzz" event={"ID":"84db4e3c-a849-4173-b21b-fbb75fd25be3","Type":"ContainerDied","Data":"0c7ac2aa4d774349f6fe44a54858e647415555ad8dcb863d8d1412aa247616e2"} Apr 16 04:24:24.756313 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.756204 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tns7g" event={"ID":"f41140f0-dc31-4907-aae1-f8d108bb517f","Type":"ContainerStarted","Data":"df065ba5abd60581fbf8440f770d55eb088e71299d7d7bde29f76f26ce5283dc"} Apr 16 04:24:24.757694 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.757657 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-x8979" 
event={"ID":"98bbb6ac-6605-4f70-9681-333eea1951c2","Type":"ContainerStarted","Data":"9e400f8bb10561a0ca7cb4393d70ba009b3c8299a4bb723694fa8b5093ae8fc1"} Apr 16 04:24:24.759097 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.759074 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" event={"ID":"6b433758-a3ea-42f0-997e-16cc16207047","Type":"ContainerStarted","Data":"8b278ac66b366e34be7bcf23cb0d7f6be01026c68d55b644a0cccabff84e3d86"} Apr 16 04:24:24.760679 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.760659 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n4qkb" event={"ID":"c70b5e71-9ba3-4891-851c-653635c97ffb","Type":"ContainerStarted","Data":"38f5b3a66dc2fb6ff807bdeb43586ed6d7025b6f13e47474035d9b3fa94c4388"} Apr 16 04:24:24.762492 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.762475 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/ovn-acl-logging/0.log" Apr 16 04:24:24.762744 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.762725 2567 generic.go:358] "Generic (PLEG): container finished" podID="fbe3cd6d-315c-4d44-81a3-217be3d98348" containerID="2ea0d613b1530848e1f6ad372fdf31322532e54b016d65b3f19516507c96cb0c" exitCode=1 Apr 16 04:24:24.762817 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.762757 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" event={"ID":"fbe3cd6d-315c-4d44-81a3-217be3d98348","Type":"ContainerStarted","Data":"943c9992a2ba1279164954cbe48c57e6091dc3aa223bce81121e20c4bc47680b"} Apr 16 04:24:24.762817 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.762771 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" 
event={"ID":"fbe3cd6d-315c-4d44-81a3-217be3d98348","Type":"ContainerStarted","Data":"0dd49358d508a381fa43794864a48f6ad3dd305a3ffd4aac0999896e7a3e324f"} Apr 16 04:24:24.762817 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.762781 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" event={"ID":"fbe3cd6d-315c-4d44-81a3-217be3d98348","Type":"ContainerDied","Data":"2ea0d613b1530848e1f6ad372fdf31322532e54b016d65b3f19516507c96cb0c"} Apr 16 04:24:24.762817 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.762792 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" event={"ID":"fbe3cd6d-315c-4d44-81a3-217be3d98348","Type":"ContainerStarted","Data":"4f49e49253519146efa1d30ec5c9bcc1229b0638c4ad59a6b88bbf7664fe625c"} Apr 16 04:24:24.764693 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.764653 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7pjb8" podStartSLOduration=4.092695768 podStartE2EDuration="20.764642896s" podCreationTimestamp="2026-04-16 04:24:04 +0000 UTC" firstStartedPulling="2026-04-16 04:24:07.444242122 +0000 UTC m=+3.429692168" lastFinishedPulling="2026-04-16 04:24:24.116189254 +0000 UTC m=+20.101639296" observedRunningTime="2026-04-16 04:24:24.764624697 +0000 UTC m=+20.750074773" watchObservedRunningTime="2026-04-16 04:24:24.764642896 +0000 UTC m=+20.750092959" Apr 16 04:24:24.765038 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.765015 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-81.ec2.internal" podStartSLOduration=19.765009815 podStartE2EDuration="19.765009815s" podCreationTimestamp="2026-04-16 04:24:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 04:24:09.74175978 +0000 UTC 
m=+5.727209845" watchObservedRunningTime="2026-04-16 04:24:24.765009815 +0000 UTC m=+20.750459880" Apr 16 04:24:24.803123 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.803066 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-n4qkb" podStartSLOduration=4.075755127 podStartE2EDuration="20.803044627s" podCreationTimestamp="2026-04-16 04:24:04 +0000 UTC" firstStartedPulling="2026-04-16 04:24:07.443315028 +0000 UTC m=+3.428765072" lastFinishedPulling="2026-04-16 04:24:24.170604519 +0000 UTC m=+20.156054572" observedRunningTime="2026-04-16 04:24:24.802305361 +0000 UTC m=+20.787755440" watchObservedRunningTime="2026-04-16 04:24:24.803044627 +0000 UTC m=+20.788494697" Apr 16 04:24:24.817448 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.817399 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-qvg6g" podStartSLOduration=4.14871412 podStartE2EDuration="20.817385025s" podCreationTimestamp="2026-04-16 04:24:04 +0000 UTC" firstStartedPulling="2026-04-16 04:24:07.449600353 +0000 UTC m=+3.435050395" lastFinishedPulling="2026-04-16 04:24:24.118271251 +0000 UTC m=+20.103721300" observedRunningTime="2026-04-16 04:24:24.817181155 +0000 UTC m=+20.802631229" watchObservedRunningTime="2026-04-16 04:24:24.817385025 +0000 UTC m=+20.802835089" Apr 16 04:24:24.831818 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.831771 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tns7g" podStartSLOduration=11.937292149 podStartE2EDuration="20.831759839s" podCreationTimestamp="2026-04-16 04:24:04 +0000 UTC" firstStartedPulling="2026-04-16 04:24:07.442626857 +0000 UTC m=+3.428076899" lastFinishedPulling="2026-04-16 04:24:16.337094525 +0000 UTC m=+12.322544589" observedRunningTime="2026-04-16 04:24:24.831372104 +0000 UTC m=+20.816822164" watchObservedRunningTime="2026-04-16 04:24:24.831759839 +0000 UTC 
m=+20.817209904" Apr 16 04:24:24.850787 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:24.850745 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-x8979" podStartSLOduration=4.274490972 podStartE2EDuration="20.850730202s" podCreationTimestamp="2026-04-16 04:24:04 +0000 UTC" firstStartedPulling="2026-04-16 04:24:07.431479272 +0000 UTC m=+3.416929314" lastFinishedPulling="2026-04-16 04:24:24.007718502 +0000 UTC m=+19.993168544" observedRunningTime="2026-04-16 04:24:24.849887903 +0000 UTC m=+20.835337966" watchObservedRunningTime="2026-04-16 04:24:24.850730202 +0000 UTC m=+20.836180265" Apr 16 04:24:25.670309 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:25.670276 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:24:25.670485 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:25.670434 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b2zb5" podUID="c1c49d2a-98d1-4d28-9e17-3967b6431a92" Apr 16 04:24:25.767619 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:25.767592 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/ovn-acl-logging/0.log" Apr 16 04:24:25.768132 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:25.768023 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" event={"ID":"fbe3cd6d-315c-4d44-81a3-217be3d98348","Type":"ContainerStarted","Data":"73013e99eb8fe789fc99c3e58595a2960d86a83939c934ddc0f242566137bc34"} Apr 16 04:24:25.768132 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:25.768065 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" event={"ID":"fbe3cd6d-315c-4d44-81a3-217be3d98348","Type":"ContainerStarted","Data":"efbaef8ba336516c3db5d3f776daae820e76b16a6357279b3f1fd84d4753eff9"} Apr 16 04:24:25.769439 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:25.769413 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xgrdb" event={"ID":"0423948b-a820-4490-bb94-3810fdef3c06","Type":"ContainerStarted","Data":"bd3c09c4d88d18f715a16b08509fd9b9b035837acc209d72e104a783b665a2f1"} Apr 16 04:24:25.782963 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:25.782898 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-xgrdb" podStartSLOduration=5.113354636 podStartE2EDuration="21.78287684s" podCreationTimestamp="2026-04-16 04:24:04 +0000 UTC" firstStartedPulling="2026-04-16 04:24:07.446602965 +0000 UTC m=+3.432053017" lastFinishedPulling="2026-04-16 04:24:24.116125168 +0000 UTC m=+20.101575221" observedRunningTime="2026-04-16 04:24:25.782676177 +0000 UTC m=+21.768126244" watchObservedRunningTime="2026-04-16 04:24:25.78287684 
+0000 UTC m=+21.768326906" Apr 16 04:24:25.806729 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:25.806699 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 04:24:26.556556 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:26.556224 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T04:24:25.806721362Z","UUID":"027d5e8b-24f0-4e99-8aa3-40842a3d4b36","Handler":null,"Name":"","Endpoint":""} Apr 16 04:24:26.558885 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:26.558859 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 04:24:26.559486 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:26.559433 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 04:24:26.670280 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:26.670183 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:24:26.670280 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:26.670216 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:26.670542 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:26.670343 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6hlh" podUID="638d6e19-46c9-4d63-a7b2-461e842da022" Apr 16 04:24:26.670542 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:26.670466 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28th9" podUID="dcdc9bd2-6cdd-48d1-850f-80adbc878d5f" Apr 16 04:24:26.773972 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:26.773927 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" event={"ID":"70cfcc27-c214-4fec-a12c-618606910cbf","Type":"ContainerStarted","Data":"fc28e0a6a14221ae4f3a029b860e5d8272610662664a11b03d803727314be256"} Apr 16 04:24:27.493805 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:27.493762 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-x8979" Apr 16 04:24:27.494456 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:27.494439 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-x8979" Apr 16 04:24:27.669911 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:27.669877 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:24:27.670081 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:27.669982 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b2zb5" podUID="c1c49d2a-98d1-4d28-9e17-3967b6431a92"
Apr 16 04:24:27.778602 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:27.778529 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/ovn-acl-logging/0.log"
Apr 16 04:24:27.779240 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:27.778905 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" event={"ID":"fbe3cd6d-315c-4d44-81a3-217be3d98348","Type":"ContainerStarted","Data":"555453a3bb1c827ff796c0d5ebe6ee1f791f86e2ea127a0eeac540d1f3249af9"}
Apr 16 04:24:27.780893 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:27.780864 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" event={"ID":"70cfcc27-c214-4fec-a12c-618606910cbf","Type":"ContainerStarted","Data":"35ac453e2959bf8dd122059f4e28f39d51754e49f6e5bcc2424d83d08695c4a7"}
Apr 16 04:24:27.799329 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:27.799272 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmsfr" podStartSLOduration=3.220665027 podStartE2EDuration="22.79925747s" podCreationTimestamp="2026-04-16 04:24:05 +0000 UTC" firstStartedPulling="2026-04-16 04:24:07.449207568 +0000 UTC m=+3.434657609" lastFinishedPulling="2026-04-16 04:24:27.027799998 +0000 UTC m=+23.013250052" observedRunningTime="2026-04-16 04:24:27.798882543 +0000 UTC m=+23.784332607" watchObservedRunningTime="2026-04-16 04:24:27.79925747 +0000 UTC m=+23.784707534"
Apr 16 04:24:28.669919 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:28.669878 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6hlh"
Apr 16 04:24:28.670105 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:28.669927 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28th9"
Apr 16 04:24:28.670105 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:28.670015 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6hlh" podUID="638d6e19-46c9-4d63-a7b2-461e842da022"
Apr 16 04:24:28.670224 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:28.670139 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28th9" podUID="dcdc9bd2-6cdd-48d1-850f-80adbc878d5f"
Apr 16 04:24:29.670390 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:29.670206 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2zb5"
Apr 16 04:24:29.670922 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:29.670464 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2zb5" podUID="c1c49d2a-98d1-4d28-9e17-3967b6431a92"
Apr 16 04:24:29.787636 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:29.787610 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/ovn-acl-logging/0.log"
Apr 16 04:24:29.787974 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:29.787947 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" event={"ID":"fbe3cd6d-315c-4d44-81a3-217be3d98348","Type":"ContainerStarted","Data":"5250c320cde458b213be89c210bb096eea766f2b307771a19a1f02ae4a0b73f8"}
Apr 16 04:24:29.788339 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:29.788311 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-s79zz"
Apr 16 04:24:29.788450 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:29.788434 2567 scope.go:117] "RemoveContainer" containerID="2ea0d613b1530848e1f6ad372fdf31322532e54b016d65b3f19516507c96cb0c"
Apr 16 04:24:29.789653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:29.789603 2567 generic.go:358] "Generic (PLEG): container finished" podID="84db4e3c-a849-4173-b21b-fbb75fd25be3" containerID="2bf52d1dc731808f131cf30447c626e28b9b636d4ba2b167ab6220b419d8abc5" exitCode=0
Apr 16 04:24:29.789653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:29.789643 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wjzz" event={"ID":"84db4e3c-a849-4173-b21b-fbb75fd25be3","Type":"ContainerDied","Data":"2bf52d1dc731808f131cf30447c626e28b9b636d4ba2b167ab6220b419d8abc5"}
Apr 16 04:24:29.804612 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:29.804593 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s79zz"
Apr 16 04:24:30.669417 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:30.669387 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28th9"
Apr 16 04:24:30.669583 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:30.669498 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28th9" podUID="dcdc9bd2-6cdd-48d1-850f-80adbc878d5f"
Apr 16 04:24:30.669583 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:30.669538 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6hlh"
Apr 16 04:24:30.669690 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:30.669607 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6hlh" podUID="638d6e19-46c9-4d63-a7b2-461e842da022"
Apr 16 04:24:30.794046 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:30.794018 2567 generic.go:358] "Generic (PLEG): container finished" podID="84db4e3c-a849-4173-b21b-fbb75fd25be3" containerID="2ac895de86139a6494a25cdf44d5523d8859589ff76f5a2ae61c7a04a616c334" exitCode=0
Apr 16 04:24:30.794464 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:30.794104 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wjzz" event={"ID":"84db4e3c-a849-4173-b21b-fbb75fd25be3","Type":"ContainerDied","Data":"2ac895de86139a6494a25cdf44d5523d8859589ff76f5a2ae61c7a04a616c334"}
Apr 16 04:24:30.797950 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:30.797932 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/ovn-acl-logging/0.log"
Apr 16 04:24:30.798282 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:30.798259 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" event={"ID":"fbe3cd6d-315c-4d44-81a3-217be3d98348","Type":"ContainerStarted","Data":"456592783a9627226720ffae37612cbbaf0b2afbf8220a591c4e0678fb94158b"}
Apr 16 04:24:30.798417 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:30.798405 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 04:24:30.798661 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:30.798638 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-s79zz"
Apr 16 04:24:30.814102 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:30.814053 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s79zz"
Apr 16 04:24:30.841822 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:30.841772 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" podStartSLOduration=10.098341506 podStartE2EDuration="26.84175694s" podCreationTimestamp="2026-04-16 04:24:04 +0000 UTC" firstStartedPulling="2026-04-16 04:24:07.44756938 +0000 UTC m=+3.433019422" lastFinishedPulling="2026-04-16 04:24:24.190984803 +0000 UTC m=+20.176434856" observedRunningTime="2026-04-16 04:24:30.841450262 +0000 UTC m=+26.826900329" watchObservedRunningTime="2026-04-16 04:24:30.84175694 +0000 UTC m=+26.827207004"
Apr 16 04:24:30.923557 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:30.923531 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-s79zz"
Apr 16 04:24:31.195043 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:31.194976 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-28th9"]
Apr 16 04:24:31.195161 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:31.195090 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28th9"
Apr 16 04:24:31.195195 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:31.195169 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28th9" podUID="dcdc9bd2-6cdd-48d1-850f-80adbc878d5f"
Apr 16 04:24:31.198373 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:31.198343 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j6hlh"]
Apr 16 04:24:31.198507 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:31.198471 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6hlh"
Apr 16 04:24:31.198630 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:31.198606 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6hlh" podUID="638d6e19-46c9-4d63-a7b2-461e842da022"
Apr 16 04:24:31.199442 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:31.199417 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b2zb5"]
Apr 16 04:24:31.199601 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:31.199585 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2zb5"
Apr 16 04:24:31.199715 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:31.199693 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2zb5" podUID="c1c49d2a-98d1-4d28-9e17-3967b6431a92"
Apr 16 04:24:31.802437 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:31.802251 2567 generic.go:358] "Generic (PLEG): container finished" podID="84db4e3c-a849-4173-b21b-fbb75fd25be3" containerID="893cd30c2fdfaebcfe804d7d258bcfdfa0cbbffc6d1ff4124a22d065b8cdc643" exitCode=0
Apr 16 04:24:31.802437 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:31.802325 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wjzz" event={"ID":"84db4e3c-a849-4173-b21b-fbb75fd25be3","Type":"ContainerDied","Data":"893cd30c2fdfaebcfe804d7d258bcfdfa0cbbffc6d1ff4124a22d065b8cdc643"}
Apr 16 04:24:32.374033 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:32.373997 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-x8979"
Apr 16 04:24:32.374221 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:32.374161 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 04:24:32.374804 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:32.374751 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-x8979"
Apr 16 04:24:32.670158 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:32.670084 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28th9"
Apr 16 04:24:32.670362 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:32.670084 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6hlh"
Apr 16 04:24:32.670362 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:32.670213 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28th9" podUID="dcdc9bd2-6cdd-48d1-850f-80adbc878d5f"
Apr 16 04:24:32.670362 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:32.670081 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2zb5"
Apr 16 04:24:32.670362 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:32.670305 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6hlh" podUID="638d6e19-46c9-4d63-a7b2-461e842da022"
Apr 16 04:24:32.670572 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:32.670358 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2zb5" podUID="c1c49d2a-98d1-4d28-9e17-3967b6431a92"
Apr 16 04:24:34.671222 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:34.671182 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6hlh"
Apr 16 04:24:34.672016 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:34.671280 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28th9"
Apr 16 04:24:34.672016 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:34.671305 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2zb5"
Apr 16 04:24:34.672016 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:34.671337 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6hlh" podUID="638d6e19-46c9-4d63-a7b2-461e842da022"
Apr 16 04:24:34.672016 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:34.671350 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2zb5" podUID="c1c49d2a-98d1-4d28-9e17-3967b6431a92"
Apr 16 04:24:34.672016 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:34.671389 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28th9" podUID="dcdc9bd2-6cdd-48d1-850f-80adbc878d5f"
Apr 16 04:24:36.670125 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:36.670084 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6hlh"
Apr 16 04:24:36.670613 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:36.670084 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28th9"
Apr 16 04:24:36.670613 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:36.670223 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6hlh" podUID="638d6e19-46c9-4d63-a7b2-461e842da022"
Apr 16 04:24:36.670613 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:36.670085 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2zb5"
Apr 16 04:24:36.670613 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:36.670302 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-28th9" podUID="dcdc9bd2-6cdd-48d1-850f-80adbc878d5f"
Apr 16 04:24:36.670613 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:36.670391 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2zb5" podUID="c1c49d2a-98d1-4d28-9e17-3967b6431a92"
Apr 16 04:24:37.307290 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.307261 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-81.ec2.internal" event="NodeReady"
Apr 16 04:24:37.307458 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.307385 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 04:24:37.338713 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.338678 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-589df4bd79-gpnbv"]
Apr 16 04:24:37.354867 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.354820 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-589df4bd79-gpnbv"]
Apr 16 04:24:37.355015 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.354968 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.358214 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.358148 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-hvkh7\""
Apr 16 04:24:37.358214 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.358163 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 04:24:37.358400 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.358171 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 04:24:37.358400 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.358232 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 04:24:37.358605 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.358587 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6jqxs"]
Apr 16 04:24:37.362565 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.362544 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 04:24:37.386360 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.386331 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-m5jm4"]
Apr 16 04:24:37.386525 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.386479 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6jqxs"
Apr 16 04:24:37.389436 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.389413 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 04:24:37.389584 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.389484 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 04:24:37.389584 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.389503 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 04:24:37.389720 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.389704 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-d4mbv\""
Apr 16 04:24:37.400568 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.400543 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6jqxs"]
Apr 16 04:24:37.400568 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.400569 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-m5jm4"]
Apr 16 04:24:37.400742 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.400672 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-m5jm4"
Apr 16 04:24:37.403234 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.403215 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 04:24:37.403363 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.403346 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 04:24:37.403447 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.403429 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-smz8d\""
Apr 16 04:24:37.441356 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.441321 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-trusted-ca\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.441526 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.441367 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-certificates\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.441526 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.441474 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-installation-pull-secrets\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.441526 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.441499 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-bound-sa-token\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.441675 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.441536 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-image-registry-private-configuration\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.441675 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.441559 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx76m\" (UniqueName: \"kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-kube-api-access-sx76m\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.441675 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.441621 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.441795 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.441680 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-ca-trust-extracted\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.542004 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.541966 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-certificates\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.542166 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.542029 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-installation-pull-secrets\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.542166 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.542050 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-bound-sa-token\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.542166 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.542073 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/390e05c5-2dbf-454b-872e-6a8969a124ae-config-volume\") pod \"dns-default-m5jm4\" (UID: \"390e05c5-2dbf-454b-872e-6a8969a124ae\") " pod="openshift-dns/dns-default-m5jm4"
Apr 16 04:24:37.542166 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.542089 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls\") pod \"dns-default-m5jm4\" (UID: \"390e05c5-2dbf-454b-872e-6a8969a124ae\") " pod="openshift-dns/dns-default-m5jm4"
Apr 16 04:24:37.542297 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.542208 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-image-registry-private-configuration\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.542297 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.542237 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/390e05c5-2dbf-454b-872e-6a8969a124ae-tmp-dir\") pod \"dns-default-m5jm4\" (UID: \"390e05c5-2dbf-454b-872e-6a8969a124ae\") " pod="openshift-dns/dns-default-m5jm4"
Apr 16 04:24:37.542297 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.542269 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sx76m\" (UniqueName: \"kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-kube-api-access-sx76m\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.542429 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.542300 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.542429 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.542354 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrhwv\" (UniqueName: \"kubernetes.io/projected/390e05c5-2dbf-454b-872e-6a8969a124ae-kube-api-access-xrhwv\") pod \"dns-default-m5jm4\" (UID: \"390e05c5-2dbf-454b-872e-6a8969a124ae\") " pod="openshift-dns/dns-default-m5jm4"
Apr 16 04:24:37.542429 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:37.542405 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 04:24:37.542429 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.542418 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert\") pod \"ingress-canary-6jqxs\" (UID: \"c19b5e4b-3455-4e1c-b332-ba6c51fb153b\") " pod="openshift-ingress-canary/ingress-canary-6jqxs"
Apr 16 04:24:37.542429 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:37.542428 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-589df4bd79-gpnbv: secret "image-registry-tls" not found
Apr 16 04:24:37.542692 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.542446 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frrwf\" (UniqueName: \"kubernetes.io/projected/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-kube-api-access-frrwf\") pod \"ingress-canary-6jqxs\" (UID: \"c19b5e4b-3455-4e1c-b332-ba6c51fb153b\") " pod="openshift-ingress-canary/ingress-canary-6jqxs"
Apr 16 04:24:37.542692 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.542498 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-ca-trust-extracted\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.542692 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:37.542547 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls podName:cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd nodeName:}" failed. No retries permitted until 2026-04-16 04:24:38.042525188 +0000 UTC m=+34.027975245 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls") pod "image-registry-589df4bd79-gpnbv" (UID: "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd") : secret "image-registry-tls" not found
Apr 16 04:24:37.542692 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.542597 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-certificates\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.542894 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.542807 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-trusted-ca\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.542894 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.542861 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-ca-trust-extracted\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.543434 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.543415 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-trusted-ca\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.546203 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.546181 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-image-registry-private-configuration\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.546313 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.546224 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-installation-pull-secrets\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.567973 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.567899 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-bound-sa-token\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.578354 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.578327 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx76m\" (UniqueName: \"kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-kube-api-access-sx76m\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:24:37.644077 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.643800 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/390e05c5-2dbf-454b-872e-6a8969a124ae-config-volume\") pod \"dns-default-m5jm4\" (UID: \"390e05c5-2dbf-454b-872e-6a8969a124ae\") " pod="openshift-dns/dns-default-m5jm4"
Apr 16 04:24:37.644244 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.644105 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls\") pod \"dns-default-m5jm4\" (UID: \"390e05c5-2dbf-454b-872e-6a8969a124ae\") " pod="openshift-dns/dns-default-m5jm4"
Apr 16 04:24:37.644244 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.644147 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/390e05c5-2dbf-454b-872e-6a8969a124ae-tmp-dir\") pod \"dns-default-m5jm4\" (UID: \"390e05c5-2dbf-454b-872e-6a8969a124ae\") " pod="openshift-dns/dns-default-m5jm4"
Apr 16 04:24:37.644244 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.644207 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrhwv\" (UniqueName: \"kubernetes.io/projected/390e05c5-2dbf-454b-872e-6a8969a124ae-kube-api-access-xrhwv\") pod \"dns-default-m5jm4\" (UID: \"390e05c5-2dbf-454b-872e-6a8969a124ae\") " pod="openshift-dns/dns-default-m5jm4"
Apr 16 04:24:37.644244 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.644240 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert\") pod \"ingress-canary-6jqxs\" (UID: \"c19b5e4b-3455-4e1c-b332-ba6c51fb153b\") " pod="openshift-ingress-canary/ingress-canary-6jqxs"
Apr 16 04:24:37.644442 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.644268 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frrwf\" (UniqueName: \"kubernetes.io/projected/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-kube-api-access-frrwf\") pod \"ingress-canary-6jqxs\" (UID: \"c19b5e4b-3455-4e1c-b332-ba6c51fb153b\") " pod="openshift-ingress-canary/ingress-canary-6jqxs"
Apr 16 04:24:37.644442 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:37.644320 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 04:24:37.644442 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:37.644387 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 04:24:37.644442 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.644409 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/390e05c5-2dbf-454b-872e-6a8969a124ae-config-volume\") pod \"dns-default-m5jm4\" (UID: \"390e05c5-2dbf-454b-872e-6a8969a124ae\") " pod="openshift-dns/dns-default-m5jm4"
Apr 16 04:24:37.644442 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:37.644423 2567 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls podName:390e05c5-2dbf-454b-872e-6a8969a124ae nodeName:}" failed. No retries permitted until 2026-04-16 04:24:38.144399906 +0000 UTC m=+34.129849968 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls") pod "dns-default-m5jm4" (UID: "390e05c5-2dbf-454b-872e-6a8969a124ae") : secret "dns-default-metrics-tls" not found Apr 16 04:24:37.644616 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:37.644451 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert podName:c19b5e4b-3455-4e1c-b332-ba6c51fb153b nodeName:}" failed. No retries permitted until 2026-04-16 04:24:38.144430368 +0000 UTC m=+34.129880431 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert") pod "ingress-canary-6jqxs" (UID: "c19b5e4b-3455-4e1c-b332-ba6c51fb153b") : secret "canary-serving-cert" not found Apr 16 04:24:37.644616 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.644521 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/390e05c5-2dbf-454b-872e-6a8969a124ae-tmp-dir\") pod \"dns-default-m5jm4\" (UID: \"390e05c5-2dbf-454b-872e-6a8969a124ae\") " pod="openshift-dns/dns-default-m5jm4" Apr 16 04:24:37.653623 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.653591 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrhwv\" (UniqueName: \"kubernetes.io/projected/390e05c5-2dbf-454b-872e-6a8969a124ae-kube-api-access-xrhwv\") pod \"dns-default-m5jm4\" (UID: \"390e05c5-2dbf-454b-872e-6a8969a124ae\") " pod="openshift-dns/dns-default-m5jm4" Apr 16 04:24:37.653809 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.653788 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frrwf\" (UniqueName: \"kubernetes.io/projected/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-kube-api-access-frrwf\") pod \"ingress-canary-6jqxs\" (UID: \"c19b5e4b-3455-4e1c-b332-ba6c51fb153b\") " pod="openshift-ingress-canary/ingress-canary-6jqxs" Apr 16 04:24:37.817756 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:37.817723 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wjzz" event={"ID":"84db4e3c-a849-4173-b21b-fbb75fd25be3","Type":"ContainerStarted","Data":"49c452d08537999834b04298528ac8717bd1d3cad019821eb59c1fda38e0d9dc"} Apr 16 04:24:38.047057 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:38.047018 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv" Apr 16 04:24:38.047218 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:38.047174 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 04:24:38.047218 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:38.047188 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-589df4bd79-gpnbv: secret "image-registry-tls" not found Apr 16 04:24:38.047291 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:38.047244 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls podName:cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd nodeName:}" failed. No retries permitted until 2026-04-16 04:24:39.04722744 +0000 UTC m=+35.032677486 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls") pod "image-registry-589df4bd79-gpnbv" (UID: "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd") : secret "image-registry-tls" not found Apr 16 04:24:38.148146 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:38.148068 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert\") pod \"ingress-canary-6jqxs\" (UID: \"c19b5e4b-3455-4e1c-b332-ba6c51fb153b\") " pod="openshift-ingress-canary/ingress-canary-6jqxs" Apr 16 04:24:38.148270 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:38.148213 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls\") pod \"dns-default-m5jm4\" (UID: \"390e05c5-2dbf-454b-872e-6a8969a124ae\") " pod="openshift-dns/dns-default-m5jm4" Apr 16 04:24:38.148270 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:38.148214 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 04:24:38.148333 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:38.148276 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert podName:c19b5e4b-3455-4e1c-b332-ba6c51fb153b nodeName:}" failed. No retries permitted until 2026-04-16 04:24:39.148258533 +0000 UTC m=+35.133708575 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert") pod "ingress-canary-6jqxs" (UID: "c19b5e4b-3455-4e1c-b332-ba6c51fb153b") : secret "canary-serving-cert" not found Apr 16 04:24:38.148333 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:38.148295 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 04:24:38.148333 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:38.148332 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls podName:390e05c5-2dbf-454b-872e-6a8969a124ae nodeName:}" failed. No retries permitted until 2026-04-16 04:24:39.148320675 +0000 UTC m=+35.133770716 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls") pod "dns-default-m5jm4" (UID: "390e05c5-2dbf-454b-872e-6a8969a124ae") : secret "dns-default-metrics-tls" not found Apr 16 04:24:38.248877 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:38.248819 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs\") pod \"network-metrics-daemon-j6hlh\" (UID: \"638d6e19-46c9-4d63-a7b2-461e842da022\") " pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:24:38.249053 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:38.248965 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:38.249053 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:38.249039 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs podName:638d6e19-46c9-4d63-a7b2-461e842da022 
nodeName:}" failed. No retries permitted until 2026-04-16 04:25:10.24902175 +0000 UTC m=+66.234471794 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs") pod "network-metrics-daemon-j6hlh" (UID: "638d6e19-46c9-4d63-a7b2-461e842da022") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 04:24:38.249141 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:38.249072 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2kbm\" (UniqueName: \"kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm\") pod \"network-check-target-b2zb5\" (UID: \"c1c49d2a-98d1-4d28-9e17-3967b6431a92\") " pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:24:38.249205 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:38.249178 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 04:24:38.249205 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:38.249200 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 04:24:38.249272 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:38.249212 2567 projected.go:194] Error preparing data for projected volume kube-api-access-w2kbm for pod openshift-network-diagnostics/network-check-target-b2zb5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:38.249272 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:38.249253 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm 
podName:c1c49d2a-98d1-4d28-9e17-3967b6431a92 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:10.24924167 +0000 UTC m=+66.234691717 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-w2kbm" (UniqueName: "kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm") pod "network-check-target-b2zb5" (UID: "c1c49d2a-98d1-4d28-9e17-3967b6431a92") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 04:24:38.349890 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:38.349850 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret\") pod \"global-pull-secret-syncer-28th9\" (UID: \"dcdc9bd2-6cdd-48d1-850f-80adbc878d5f\") " pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:38.350075 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:38.349931 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 04:24:38.350075 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:38.349997 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret podName:dcdc9bd2-6cdd-48d1-850f-80adbc878d5f nodeName:}" failed. No retries permitted until 2026-04-16 04:25:10.349982566 +0000 UTC m=+66.335432628 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret") pod "global-pull-secret-syncer-28th9" (UID: "dcdc9bd2-6cdd-48d1-850f-80adbc878d5f") : object "kube-system"/"original-pull-secret" not registered Apr 16 04:24:38.669716 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:38.669680 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:24:38.669932 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:38.669680 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:24:38.670030 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:38.669698 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:24:38.672583 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:38.672565 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 04:24:38.672708 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:38.672565 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 04:24:38.672708 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:38.672696 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 04:24:38.672817 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:38.672717 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xfxbt\"" Apr 16 04:24:38.672817 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:38.672731 2567 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-l4rjf\"" Apr 16 04:24:38.672817 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:38.672795 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 04:24:38.822147 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:38.822060 2567 generic.go:358] "Generic (PLEG): container finished" podID="84db4e3c-a849-4173-b21b-fbb75fd25be3" containerID="49c452d08537999834b04298528ac8717bd1d3cad019821eb59c1fda38e0d9dc" exitCode=0 Apr 16 04:24:38.822147 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:38.822103 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wjzz" event={"ID":"84db4e3c-a849-4173-b21b-fbb75fd25be3","Type":"ContainerDied","Data":"49c452d08537999834b04298528ac8717bd1d3cad019821eb59c1fda38e0d9dc"} Apr 16 04:24:39.054701 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:39.054671 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv" Apr 16 04:24:39.054923 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:39.054817 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 04:24:39.054923 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:39.054847 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-589df4bd79-gpnbv: secret "image-registry-tls" not found Apr 16 04:24:39.054923 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:39.054905 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls podName:cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd nodeName:}" failed. No retries permitted until 2026-04-16 04:24:41.054888128 +0000 UTC m=+37.040338170 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls") pod "image-registry-589df4bd79-gpnbv" (UID: "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd") : secret "image-registry-tls" not found Apr 16 04:24:39.155291 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:39.155195 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert\") pod \"ingress-canary-6jqxs\" (UID: \"c19b5e4b-3455-4e1c-b332-ba6c51fb153b\") " pod="openshift-ingress-canary/ingress-canary-6jqxs" Apr 16 04:24:39.155441 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:39.155365 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 04:24:39.155441 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:39.155376 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls\") pod \"dns-default-m5jm4\" (UID: \"390e05c5-2dbf-454b-872e-6a8969a124ae\") " pod="openshift-dns/dns-default-m5jm4" Apr 16 04:24:39.155441 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:39.155427 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert podName:c19b5e4b-3455-4e1c-b332-ba6c51fb153b nodeName:}" failed. No retries permitted until 2026-04-16 04:24:41.155410339 +0000 UTC m=+37.140860381 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert") pod "ingress-canary-6jqxs" (UID: "c19b5e4b-3455-4e1c-b332-ba6c51fb153b") : secret "canary-serving-cert" not found Apr 16 04:24:39.155573 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:39.155475 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 04:24:39.155573 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:39.155517 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls podName:390e05c5-2dbf-454b-872e-6a8969a124ae nodeName:}" failed. No retries permitted until 2026-04-16 04:24:41.155505972 +0000 UTC m=+37.140956013 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls") pod "dns-default-m5jm4" (UID: "390e05c5-2dbf-454b-872e-6a8969a124ae") : secret "dns-default-metrics-tls" not found Apr 16 04:24:39.826237 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:39.826157 2567 generic.go:358] "Generic (PLEG): container finished" podID="84db4e3c-a849-4173-b21b-fbb75fd25be3" containerID="fa34b58ff8c2db372a1abb1931970c08e23e6334175dea7ab310a5790ff92f35" exitCode=0 Apr 16 04:24:39.826237 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:39.826215 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wjzz" event={"ID":"84db4e3c-a849-4173-b21b-fbb75fd25be3","Type":"ContainerDied","Data":"fa34b58ff8c2db372a1abb1931970c08e23e6334175dea7ab310a5790ff92f35"} Apr 16 04:24:40.830910 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:40.830873 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wjzz" 
event={"ID":"84db4e3c-a849-4173-b21b-fbb75fd25be3","Type":"ContainerStarted","Data":"138d8a87af474df22e451c991a3e192bb6f32c939cdeb0856ae21875e5f3e51e"} Apr 16 04:24:40.853381 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:40.853292 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6wjzz" podStartSLOduration=6.700258938 podStartE2EDuration="36.853276317s" podCreationTimestamp="2026-04-16 04:24:04 +0000 UTC" firstStartedPulling="2026-04-16 04:24:07.445043448 +0000 UTC m=+3.430493495" lastFinishedPulling="2026-04-16 04:24:37.598060827 +0000 UTC m=+33.583510874" observedRunningTime="2026-04-16 04:24:40.851274547 +0000 UTC m=+36.836724637" watchObservedRunningTime="2026-04-16 04:24:40.853276317 +0000 UTC m=+36.838726380" Apr 16 04:24:41.072364 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:41.072276 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv" Apr 16 04:24:41.072507 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:41.072431 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 04:24:41.072507 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:41.072450 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-589df4bd79-gpnbv: secret "image-registry-tls" not found Apr 16 04:24:41.072584 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:41.072549 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls podName:cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd nodeName:}" failed. 
No retries permitted until 2026-04-16 04:24:45.072532004 +0000 UTC m=+41.057982045 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls") pod "image-registry-589df4bd79-gpnbv" (UID: "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd") : secret "image-registry-tls" not found Apr 16 04:24:41.173061 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:41.173023 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls\") pod \"dns-default-m5jm4\" (UID: \"390e05c5-2dbf-454b-872e-6a8969a124ae\") " pod="openshift-dns/dns-default-m5jm4" Apr 16 04:24:41.173178 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:41.173082 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert\") pod \"ingress-canary-6jqxs\" (UID: \"c19b5e4b-3455-4e1c-b332-ba6c51fb153b\") " pod="openshift-ingress-canary/ingress-canary-6jqxs" Apr 16 04:24:41.173178 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:41.173173 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 04:24:41.173256 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:41.173174 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 04:24:41.173256 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:41.173224 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert podName:c19b5e4b-3455-4e1c-b332-ba6c51fb153b nodeName:}" failed. No retries permitted until 2026-04-16 04:24:45.17320938 +0000 UTC m=+41.158659422 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert") pod "ingress-canary-6jqxs" (UID: "c19b5e4b-3455-4e1c-b332-ba6c51fb153b") : secret "canary-serving-cert" not found Apr 16 04:24:41.173256 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:41.173237 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls podName:390e05c5-2dbf-454b-872e-6a8969a124ae nodeName:}" failed. No retries permitted until 2026-04-16 04:24:45.173231283 +0000 UTC m=+41.158681324 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls") pod "dns-default-m5jm4" (UID: "390e05c5-2dbf-454b-872e-6a8969a124ae") : secret "dns-default-metrics-tls" not found Apr 16 04:24:45.101190 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:45.101147 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv" Apr 16 04:24:45.101610 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:45.101265 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 04:24:45.101610 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:45.101277 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-589df4bd79-gpnbv: secret "image-registry-tls" not found Apr 16 04:24:45.101610 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:45.101329 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls podName:cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd nodeName:}" failed. No retries permitted until 2026-04-16 04:24:53.101314663 +0000 UTC m=+49.086764705 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls") pod "image-registry-589df4bd79-gpnbv" (UID: "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd") : secret "image-registry-tls" not found Apr 16 04:24:45.201677 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:45.201637 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls\") pod \"dns-default-m5jm4\" (UID: \"390e05c5-2dbf-454b-872e-6a8969a124ae\") " pod="openshift-dns/dns-default-m5jm4" Apr 16 04:24:45.201848 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:45.201699 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert\") pod \"ingress-canary-6jqxs\" (UID: \"c19b5e4b-3455-4e1c-b332-ba6c51fb153b\") " pod="openshift-ingress-canary/ingress-canary-6jqxs" Apr 16 04:24:45.201848 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:45.201790 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 04:24:45.201848 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:45.201803 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 04:24:45.201979 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:45.201860 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert podName:c19b5e4b-3455-4e1c-b332-ba6c51fb153b nodeName:}" failed. 
No retries permitted until 2026-04-16 04:24:53.201845092 +0000 UTC m=+49.187295152 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert") pod "ingress-canary-6jqxs" (UID: "c19b5e4b-3455-4e1c-b332-ba6c51fb153b") : secret "canary-serving-cert" not found Apr 16 04:24:45.201979 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:45.201890 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls podName:390e05c5-2dbf-454b-872e-6a8969a124ae nodeName:}" failed. No retries permitted until 2026-04-16 04:24:53.201872178 +0000 UTC m=+49.187322225 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls") pod "dns-default-m5jm4" (UID: "390e05c5-2dbf-454b-872e-6a8969a124ae") : secret "dns-default-metrics-tls" not found Apr 16 04:24:53.161990 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:53.161944 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv" Apr 16 04:24:53.162380 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:53.162086 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 04:24:53.162380 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:53.162107 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-589df4bd79-gpnbv: secret "image-registry-tls" not found Apr 16 04:24:53.162380 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:53.162178 2567 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls podName:cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd nodeName:}" failed. No retries permitted until 2026-04-16 04:25:09.162159687 +0000 UTC m=+65.147609729 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls") pod "image-registry-589df4bd79-gpnbv" (UID: "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd") : secret "image-registry-tls" not found Apr 16 04:24:53.263058 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:53.263020 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls\") pod \"dns-default-m5jm4\" (UID: \"390e05c5-2dbf-454b-872e-6a8969a124ae\") " pod="openshift-dns/dns-default-m5jm4" Apr 16 04:24:53.263220 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:24:53.263074 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert\") pod \"ingress-canary-6jqxs\" (UID: \"c19b5e4b-3455-4e1c-b332-ba6c51fb153b\") " pod="openshift-ingress-canary/ingress-canary-6jqxs" Apr 16 04:24:53.263220 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:53.263196 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 04:24:53.263288 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:53.263214 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 04:24:53.263288 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:53.263253 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert 
podName:c19b5e4b-3455-4e1c-b332-ba6c51fb153b nodeName:}" failed. No retries permitted until 2026-04-16 04:25:09.263239947 +0000 UTC m=+65.248689990 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert") pod "ingress-canary-6jqxs" (UID: "c19b5e4b-3455-4e1c-b332-ba6c51fb153b") : secret "canary-serving-cert" not found Apr 16 04:24:53.263288 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:24:53.263280 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls podName:390e05c5-2dbf-454b-872e-6a8969a124ae nodeName:}" failed. No retries permitted until 2026-04-16 04:25:09.263265167 +0000 UTC m=+65.248715209 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls") pod "dns-default-m5jm4" (UID: "390e05c5-2dbf-454b-872e-6a8969a124ae") : secret "dns-default-metrics-tls" not found Apr 16 04:25:02.814689 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:02.814658 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s79zz" Apr 16 04:25:09.178495 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:09.178451 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv" Apr 16 04:25:09.179020 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:09.178610 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 04:25:09.179020 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:09.178624 2567 
projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-589df4bd79-gpnbv: secret "image-registry-tls" not found Apr 16 04:25:09.179020 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:09.178683 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls podName:cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd nodeName:}" failed. No retries permitted until 2026-04-16 04:25:41.178664428 +0000 UTC m=+97.164114487 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls") pod "image-registry-589df4bd79-gpnbv" (UID: "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd") : secret "image-registry-tls" not found Apr 16 04:25:09.278908 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:09.278869 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls\") pod \"dns-default-m5jm4\" (UID: \"390e05c5-2dbf-454b-872e-6a8969a124ae\") " pod="openshift-dns/dns-default-m5jm4" Apr 16 04:25:09.279071 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:09.278934 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert\") pod \"ingress-canary-6jqxs\" (UID: \"c19b5e4b-3455-4e1c-b332-ba6c51fb153b\") " pod="openshift-ingress-canary/ingress-canary-6jqxs" Apr 16 04:25:09.279071 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:09.279020 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 04:25:09.279140 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:09.279101 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls podName:390e05c5-2dbf-454b-872e-6a8969a124ae nodeName:}" failed. No retries permitted until 2026-04-16 04:25:41.279083353 +0000 UTC m=+97.264533394 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls") pod "dns-default-m5jm4" (UID: "390e05c5-2dbf-454b-872e-6a8969a124ae") : secret "dns-default-metrics-tls" not found Apr 16 04:25:09.279140 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:09.279027 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 04:25:09.279212 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:09.279173 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert podName:c19b5e4b-3455-4e1c-b332-ba6c51fb153b nodeName:}" failed. No retries permitted until 2026-04-16 04:25:41.279161488 +0000 UTC m=+97.264611529 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert") pod "ingress-canary-6jqxs" (UID: "c19b5e4b-3455-4e1c-b332-ba6c51fb153b") : secret "canary-serving-cert" not found Apr 16 04:25:10.285668 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:10.285614 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2kbm\" (UniqueName: \"kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm\") pod \"network-check-target-b2zb5\" (UID: \"c1c49d2a-98d1-4d28-9e17-3967b6431a92\") " pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:25:10.285668 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:10.285668 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs\") pod \"network-metrics-daemon-j6hlh\" (UID: \"638d6e19-46c9-4d63-a7b2-461e842da022\") " pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:25:10.288905 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:10.288878 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 04:25:10.288986 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:10.288904 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 04:25:10.296283 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:10.296237 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 04:25:10.296361 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:10.296331 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs podName:638d6e19-46c9-4d63-a7b2-461e842da022 nodeName:}" failed. 
No retries permitted until 2026-04-16 04:26:14.296312037 +0000 UTC m=+130.281762083 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs") pod "network-metrics-daemon-j6hlh" (UID: "638d6e19-46c9-4d63-a7b2-461e842da022") : secret "metrics-daemon-secret" not found Apr 16 04:25:10.299105 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:10.299084 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 04:25:10.314242 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:10.314212 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2kbm\" (UniqueName: \"kubernetes.io/projected/c1c49d2a-98d1-4d28-9e17-3967b6431a92-kube-api-access-w2kbm\") pod \"network-check-target-b2zb5\" (UID: \"c1c49d2a-98d1-4d28-9e17-3967b6431a92\") " pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:25:10.386569 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:10.386528 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret\") pod \"global-pull-secret-syncer-28th9\" (UID: \"dcdc9bd2-6cdd-48d1-850f-80adbc878d5f\") " pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:25:10.389651 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:10.389631 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 04:25:10.399214 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:10.399179 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dcdc9bd2-6cdd-48d1-850f-80adbc878d5f-original-pull-secret\") pod \"global-pull-secret-syncer-28th9\" (UID: 
\"dcdc9bd2-6cdd-48d1-850f-80adbc878d5f\") " pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:25:10.490032 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:10.489995 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-28th9" Apr 16 04:25:10.493124 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:10.493104 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-l4rjf\"" Apr 16 04:25:10.500416 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:10.500386 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:25:10.644705 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:10.644671 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b2zb5"] Apr 16 04:25:10.647859 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:25:10.647811 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1c49d2a_98d1_4d28_9e17_3967b6431a92.slice/crio-c4c008b5f791fc52369fc5c1b01b904aaf9d2f801c7b668e64b3d4b9edba5318 WatchSource:0}: Error finding container c4c008b5f791fc52369fc5c1b01b904aaf9d2f801c7b668e64b3d4b9edba5318: Status 404 returned error can't find the container with id c4c008b5f791fc52369fc5c1b01b904aaf9d2f801c7b668e64b3d4b9edba5318 Apr 16 04:25:10.658550 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:10.658520 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-28th9"] Apr 16 04:25:10.662008 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:25:10.661976 2567 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcdc9bd2_6cdd_48d1_850f_80adbc878d5f.slice/crio-486cf8f162a67777d8fba21c0b8088683e6b949e2d78f1755e660b8ae35d5cbe WatchSource:0}: Error finding container 486cf8f162a67777d8fba21c0b8088683e6b949e2d78f1755e660b8ae35d5cbe: Status 404 returned error can't find the container with id 486cf8f162a67777d8fba21c0b8088683e6b949e2d78f1755e660b8ae35d5cbe Apr 16 04:25:10.887799 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:10.887696 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b2zb5" event={"ID":"c1c49d2a-98d1-4d28-9e17-3967b6431a92","Type":"ContainerStarted","Data":"c4c008b5f791fc52369fc5c1b01b904aaf9d2f801c7b668e64b3d4b9edba5318"} Apr 16 04:25:10.888722 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:10.888696 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-28th9" event={"ID":"dcdc9bd2-6cdd-48d1-850f-80adbc878d5f","Type":"ContainerStarted","Data":"486cf8f162a67777d8fba21c0b8088683e6b949e2d78f1755e660b8ae35d5cbe"} Apr 16 04:25:15.901495 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:15.901458 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-28th9" event={"ID":"dcdc9bd2-6cdd-48d1-850f-80adbc878d5f","Type":"ContainerStarted","Data":"5b96c00d7d739659f45270bb05d0a837413b54fda3869115decabc9607187cc8"} Apr 16 04:25:15.902777 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:15.902752 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b2zb5" event={"ID":"c1c49d2a-98d1-4d28-9e17-3967b6431a92","Type":"ContainerStarted","Data":"52f640685473ca65f5ad24b0cd9016961f90e6306f9c7c565c41569bd28274a6"} Apr 16 04:25:15.902905 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:15.902887 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:25:15.916598 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:15.916547 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-28th9" podStartSLOduration=66.668291629 podStartE2EDuration="1m10.91653327s" podCreationTimestamp="2026-04-16 04:24:05 +0000 UTC" firstStartedPulling="2026-04-16 04:25:10.663690348 +0000 UTC m=+66.649140389" lastFinishedPulling="2026-04-16 04:25:14.911931972 +0000 UTC m=+70.897382030" observedRunningTime="2026-04-16 04:25:15.916216554 +0000 UTC m=+71.901666617" watchObservedRunningTime="2026-04-16 04:25:15.91653327 +0000 UTC m=+71.901983334" Apr 16 04:25:15.929622 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:15.929566 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-b2zb5" podStartSLOduration=67.67077258 podStartE2EDuration="1m11.929550555s" podCreationTimestamp="2026-04-16 04:24:04 +0000 UTC" firstStartedPulling="2026-04-16 04:25:10.649723388 +0000 UTC m=+66.635173430" lastFinishedPulling="2026-04-16 04:25:14.908501359 +0000 UTC m=+70.893951405" observedRunningTime="2026-04-16 04:25:15.929137836 +0000 UTC m=+71.914587900" watchObservedRunningTime="2026-04-16 04:25:15.929550555 +0000 UTC m=+71.915000619" Apr 16 04:25:33.842189 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.842071 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-zwp94"] Apr 16 04:25:33.846103 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.846085 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" Apr 16 04:25:33.848675 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.848638 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-crmkh"] Apr 16 04:25:33.849484 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.849458 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 04:25:33.849629 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.849544 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 04:25:33.850027 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.850010 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 04:25:33.850290 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.850277 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 04:25:33.850386 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.850368 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-zps5l\"" Apr 16 04:25:33.851511 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.851494 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn"] Apr 16 04:25:33.851661 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.851646 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-crmkh" Apr 16 04:25:33.854284 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.854268 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6lqpg"] Apr 16 04:25:33.854435 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.854417 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" Apr 16 04:25:33.855709 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.855689 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 04:25:33.855850 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.855809 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-stn4f\"" Apr 16 04:25:33.856159 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.856140 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 04:25:33.856942 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.856921 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6lqpg" Apr 16 04:25:33.857041 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.856986 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-zwp94"] Apr 16 04:25:33.857599 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.857582 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-fznvl\"" Apr 16 04:25:33.857693 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.857617 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 04:25:33.858069 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.858054 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 04:25:33.858356 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.858339 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 04:25:33.859180 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.859161 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 04:25:33.859646 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.859631 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-wxltq\"" Apr 16 04:25:33.859719 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.859651 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 04:25:33.859821 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.859806 2567 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 04:25:33.860298 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.860283 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 04:25:33.860891 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.860872 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 04:25:33.864630 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.864606 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6lqpg"] Apr 16 04:25:33.865684 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.865661 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-crmkh"] Apr 16 04:25:33.881843 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.881798 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn"] Apr 16 04:25:33.941048 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.941018 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-kshw6"] Apr 16 04:25:33.943922 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.943906 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kshw6" Apr 16 04:25:33.946764 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.946742 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 04:25:33.947432 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.947406 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-c5qdh\"" Apr 16 04:25:33.947740 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.947716 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 04:25:33.948053 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.948033 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 04:25:33.948547 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.948296 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 04:25:33.951991 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.951968 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-kshw6"] Apr 16 04:25:33.971590 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.971560 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnc47\" (UniqueName: \"kubernetes.io/projected/c3f63c11-e5a4-44be-8b41-8c17d99cbeb5-kube-api-access-jnc47\") pod \"cluster-samples-operator-667775844f-6lqpg\" (UID: \"c3f63c11-e5a4-44be-8b41-8c17d99cbeb5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6lqpg" Apr 16 04:25:33.971743 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.971607 
2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/65296c0d-211b-4c4b-8926-070aad0da721-snapshots\") pod \"insights-operator-5785d4fcdd-zwp94\" (UID: \"65296c0d-211b-4c4b-8926-070aad0da721\") " pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" Apr 16 04:25:33.971743 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.971627 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkk7p\" (UniqueName: \"kubernetes.io/projected/82c18366-78a3-498d-b2b0-202d98470a14-kube-api-access-gkk7p\") pod \"volume-data-source-validator-7d955d5dd4-crmkh\" (UID: \"82c18366-78a3-498d-b2b0-202d98470a14\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-crmkh" Apr 16 04:25:33.971743 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.971689 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65296c0d-211b-4c4b-8926-070aad0da721-serving-cert\") pod \"insights-operator-5785d4fcdd-zwp94\" (UID: \"65296c0d-211b-4c4b-8926-070aad0da721\") " pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" Apr 16 04:25:33.971863 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.971756 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dvw2\" (UniqueName: \"kubernetes.io/projected/65296c0d-211b-4c4b-8926-070aad0da721-kube-api-access-9dvw2\") pod \"insights-operator-5785d4fcdd-zwp94\" (UID: \"65296c0d-211b-4c4b-8926-070aad0da721\") " pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" Apr 16 04:25:33.971863 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.971776 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: 
\"kubernetes.io/configmap/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-chgzn\" (UID: \"7cfe1750-0b4e-43ab-b858-92eb84a5bd2a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" Apr 16 04:25:33.971863 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.971793 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-chgzn\" (UID: \"7cfe1750-0b4e-43ab-b858-92eb84a5bd2a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" Apr 16 04:25:33.971863 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.971811 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr7xd\" (UniqueName: \"kubernetes.io/projected/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-kube-api-access-gr7xd\") pod \"cluster-monitoring-operator-6667474d89-chgzn\" (UID: \"7cfe1750-0b4e-43ab-b858-92eb84a5bd2a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" Apr 16 04:25:33.971863 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.971855 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65296c0d-211b-4c4b-8926-070aad0da721-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-zwp94\" (UID: \"65296c0d-211b-4c4b-8926-070aad0da721\") " pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" Apr 16 04:25:33.972005 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.971881 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65296c0d-211b-4c4b-8926-070aad0da721-service-ca-bundle\") pod 
\"insights-operator-5785d4fcdd-zwp94\" (UID: \"65296c0d-211b-4c4b-8926-070aad0da721\") " pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" Apr 16 04:25:33.972005 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.971929 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3f63c11-e5a4-44be-8b41-8c17d99cbeb5-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6lqpg\" (UID: \"c3f63c11-e5a4-44be-8b41-8c17d99cbeb5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6lqpg" Apr 16 04:25:33.972005 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:33.971961 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/65296c0d-211b-4c4b-8926-070aad0da721-tmp\") pod \"insights-operator-5785d4fcdd-zwp94\" (UID: \"65296c0d-211b-4c4b-8926-070aad0da721\") " pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" Apr 16 04:25:34.073027 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.072987 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gr7xd\" (UniqueName: \"kubernetes.io/projected/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-kube-api-access-gr7xd\") pod \"cluster-monitoring-operator-6667474d89-chgzn\" (UID: \"7cfe1750-0b4e-43ab-b858-92eb84a5bd2a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" Apr 16 04:25:34.073027 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.073027 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkk7p\" (UniqueName: \"kubernetes.io/projected/82c18366-78a3-498d-b2b0-202d98470a14-kube-api-access-gkk7p\") pod \"volume-data-source-validator-7d955d5dd4-crmkh\" (UID: \"82c18366-78a3-498d-b2b0-202d98470a14\") " 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-crmkh" Apr 16 04:25:34.073271 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.073051 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dvw2\" (UniqueName: \"kubernetes.io/projected/65296c0d-211b-4c4b-8926-070aad0da721-kube-api-access-9dvw2\") pod \"insights-operator-5785d4fcdd-zwp94\" (UID: \"65296c0d-211b-4c4b-8926-070aad0da721\") " pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" Apr 16 04:25:34.073271 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.073082 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3f63c11-e5a4-44be-8b41-8c17d99cbeb5-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6lqpg\" (UID: \"c3f63c11-e5a4-44be-8b41-8c17d99cbeb5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6lqpg" Apr 16 04:25:34.073271 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.073139 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/65296c0d-211b-4c4b-8926-070aad0da721-snapshots\") pod \"insights-operator-5785d4fcdd-zwp94\" (UID: \"65296c0d-211b-4c4b-8926-070aad0da721\") " pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" Apr 16 04:25:34.073271 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.073165 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jnc47\" (UniqueName: \"kubernetes.io/projected/c3f63c11-e5a4-44be-8b41-8c17d99cbeb5-kube-api-access-jnc47\") pod \"cluster-samples-operator-667775844f-6lqpg\" (UID: \"c3f63c11-e5a4-44be-8b41-8c17d99cbeb5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6lqpg" Apr 16 04:25:34.073271 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.073202 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f7e1800-0249-49d9-8db4-2138bd8e9201-serving-cert\") pod \"service-ca-operator-69965bb79d-kshw6\" (UID: \"3f7e1800-0249-49d9-8db4-2138bd8e9201\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kshw6" Apr 16 04:25:34.073271 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.073230 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65296c0d-211b-4c4b-8926-070aad0da721-serving-cert\") pod \"insights-operator-5785d4fcdd-zwp94\" (UID: \"65296c0d-211b-4c4b-8926-070aad0da721\") " pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" Apr 16 04:25:34.073271 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:34.073257 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 04:25:34.073650 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.073268 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-chgzn\" (UID: \"7cfe1750-0b4e-43ab-b858-92eb84a5bd2a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" Apr 16 04:25:34.073650 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:34.073348 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3f63c11-e5a4-44be-8b41-8c17d99cbeb5-samples-operator-tls podName:c3f63c11-e5a4-44be-8b41-8c17d99cbeb5 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:34.573325943 +0000 UTC m=+90.558776007 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/c3f63c11-e5a4-44be-8b41-8c17d99cbeb5-samples-operator-tls") pod "cluster-samples-operator-667775844f-6lqpg" (UID: "c3f63c11-e5a4-44be-8b41-8c17d99cbeb5") : secret "samples-operator-tls" not found Apr 16 04:25:34.073650 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.073406 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-chgzn\" (UID: \"7cfe1750-0b4e-43ab-b858-92eb84a5bd2a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" Apr 16 04:25:34.073650 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.073440 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f7e1800-0249-49d9-8db4-2138bd8e9201-config\") pod \"service-ca-operator-69965bb79d-kshw6\" (UID: \"3f7e1800-0249-49d9-8db4-2138bd8e9201\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kshw6" Apr 16 04:25:34.073650 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.073467 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65296c0d-211b-4c4b-8926-070aad0da721-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-zwp94\" (UID: \"65296c0d-211b-4c4b-8926-070aad0da721\") " pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" Apr 16 04:25:34.073650 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.073489 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65296c0d-211b-4c4b-8926-070aad0da721-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-zwp94\" (UID: 
\"65296c0d-211b-4c4b-8926-070aad0da721\") " pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" Apr 16 04:25:34.073650 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:34.073516 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 04:25:34.073650 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.073522 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbhvf\" (UniqueName: \"kubernetes.io/projected/3f7e1800-0249-49d9-8db4-2138bd8e9201-kube-api-access-cbhvf\") pod \"service-ca-operator-69965bb79d-kshw6\" (UID: \"3f7e1800-0249-49d9-8db4-2138bd8e9201\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kshw6" Apr 16 04:25:34.073650 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:34.073590 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-cluster-monitoring-operator-tls podName:7cfe1750-0b4e-43ab-b858-92eb84a5bd2a nodeName:}" failed. No retries permitted until 2026-04-16 04:25:34.573572111 +0000 UTC m=+90.559022152 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-chgzn" (UID: "7cfe1750-0b4e-43ab-b858-92eb84a5bd2a") : secret "cluster-monitoring-operator-tls" not found Apr 16 04:25:34.073650 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.073635 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/65296c0d-211b-4c4b-8926-070aad0da721-tmp\") pod \"insights-operator-5785d4fcdd-zwp94\" (UID: \"65296c0d-211b-4c4b-8926-070aad0da721\") " pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" Apr 16 04:25:34.074072 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.074047 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-chgzn\" (UID: \"7cfe1750-0b4e-43ab-b858-92eb84a5bd2a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" Apr 16 04:25:34.074114 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.074050 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65296c0d-211b-4c4b-8926-070aad0da721-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-zwp94\" (UID: \"65296c0d-211b-4c4b-8926-070aad0da721\") " pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" Apr 16 04:25:34.074334 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.074315 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/65296c0d-211b-4c4b-8926-070aad0da721-snapshots\") pod \"insights-operator-5785d4fcdd-zwp94\" (UID: \"65296c0d-211b-4c4b-8926-070aad0da721\") " 
pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" Apr 16 04:25:34.074370 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.074334 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65296c0d-211b-4c4b-8926-070aad0da721-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-zwp94\" (UID: \"65296c0d-211b-4c4b-8926-070aad0da721\") " pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" Apr 16 04:25:34.074370 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.074317 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/65296c0d-211b-4c4b-8926-070aad0da721-tmp\") pod \"insights-operator-5785d4fcdd-zwp94\" (UID: \"65296c0d-211b-4c4b-8926-070aad0da721\") " pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" Apr 16 04:25:34.075724 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.075697 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65296c0d-211b-4c4b-8926-070aad0da721-serving-cert\") pod \"insights-operator-5785d4fcdd-zwp94\" (UID: \"65296c0d-211b-4c4b-8926-070aad0da721\") " pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" Apr 16 04:25:34.083812 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.083785 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr7xd\" (UniqueName: \"kubernetes.io/projected/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-kube-api-access-gr7xd\") pod \"cluster-monitoring-operator-6667474d89-chgzn\" (UID: \"7cfe1750-0b4e-43ab-b858-92eb84a5bd2a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" Apr 16 04:25:34.083974 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.083956 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dvw2\" (UniqueName: 
\"kubernetes.io/projected/65296c0d-211b-4c4b-8926-070aad0da721-kube-api-access-9dvw2\") pod \"insights-operator-5785d4fcdd-zwp94\" (UID: \"65296c0d-211b-4c4b-8926-070aad0da721\") " pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" Apr 16 04:25:34.083974 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.083967 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnc47\" (UniqueName: \"kubernetes.io/projected/c3f63c11-e5a4-44be-8b41-8c17d99cbeb5-kube-api-access-jnc47\") pod \"cluster-samples-operator-667775844f-6lqpg\" (UID: \"c3f63c11-e5a4-44be-8b41-8c17d99cbeb5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6lqpg" Apr 16 04:25:34.084092 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.083958 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkk7p\" (UniqueName: \"kubernetes.io/projected/82c18366-78a3-498d-b2b0-202d98470a14-kube-api-access-gkk7p\") pod \"volume-data-source-validator-7d955d5dd4-crmkh\" (UID: \"82c18366-78a3-498d-b2b0-202d98470a14\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-crmkh" Apr 16 04:25:34.158147 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.158053 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" Apr 16 04:25:34.165861 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.165820 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-crmkh" Apr 16 04:25:34.174800 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.174724 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f7e1800-0249-49d9-8db4-2138bd8e9201-serving-cert\") pod \"service-ca-operator-69965bb79d-kshw6\" (UID: \"3f7e1800-0249-49d9-8db4-2138bd8e9201\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kshw6" Apr 16 04:25:34.174800 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.174779 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f7e1800-0249-49d9-8db4-2138bd8e9201-config\") pod \"service-ca-operator-69965bb79d-kshw6\" (UID: \"3f7e1800-0249-49d9-8db4-2138bd8e9201\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kshw6" Apr 16 04:25:34.174933 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.174801 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbhvf\" (UniqueName: \"kubernetes.io/projected/3f7e1800-0249-49d9-8db4-2138bd8e9201-kube-api-access-cbhvf\") pod \"service-ca-operator-69965bb79d-kshw6\" (UID: \"3f7e1800-0249-49d9-8db4-2138bd8e9201\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kshw6" Apr 16 04:25:34.175458 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.175431 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f7e1800-0249-49d9-8db4-2138bd8e9201-config\") pod \"service-ca-operator-69965bb79d-kshw6\" (UID: \"3f7e1800-0249-49d9-8db4-2138bd8e9201\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kshw6" Apr 16 04:25:34.176935 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.176916 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f7e1800-0249-49d9-8db4-2138bd8e9201-serving-cert\") pod \"service-ca-operator-69965bb79d-kshw6\" (UID: \"3f7e1800-0249-49d9-8db4-2138bd8e9201\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kshw6" Apr 16 04:25:34.182720 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.182673 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbhvf\" (UniqueName: \"kubernetes.io/projected/3f7e1800-0249-49d9-8db4-2138bd8e9201-kube-api-access-cbhvf\") pod \"service-ca-operator-69965bb79d-kshw6\" (UID: \"3f7e1800-0249-49d9-8db4-2138bd8e9201\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kshw6" Apr 16 04:25:34.257289 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.257206 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kshw6" Apr 16 04:25:34.282526 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.282466 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-zwp94"] Apr 16 04:25:34.286879 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:25:34.286850 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65296c0d_211b_4c4b_8926_070aad0da721.slice/crio-dd8b8ab44fcd110726b1fc4b76331d3864a355209beeef94163260780836e3d4 WatchSource:0}: Error finding container dd8b8ab44fcd110726b1fc4b76331d3864a355209beeef94163260780836e3d4: Status 404 returned error can't find the container with id dd8b8ab44fcd110726b1fc4b76331d3864a355209beeef94163260780836e3d4 Apr 16 04:25:34.299214 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.299184 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-crmkh"] Apr 16 04:25:34.303078 ip-10-0-133-81 kubenswrapper[2567]: 
W0416 04:25:34.303035 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82c18366_78a3_498d_b2b0_202d98470a14.slice/crio-a91690a7f27f3c440e2cfee0e6c9ae332169816bbca600de23e0489a33260404 WatchSource:0}: Error finding container a91690a7f27f3c440e2cfee0e6c9ae332169816bbca600de23e0489a33260404: Status 404 returned error can't find the container with id a91690a7f27f3c440e2cfee0e6c9ae332169816bbca600de23e0489a33260404 Apr 16 04:25:34.382923 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.382892 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-kshw6"] Apr 16 04:25:34.386584 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:25:34.386555 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f7e1800_0249_49d9_8db4_2138bd8e9201.slice/crio-4637472b48d1f6f3878589c41532d453d721ebf9188a4f7f6dfab6a15a3aeb8c WatchSource:0}: Error finding container 4637472b48d1f6f3878589c41532d453d721ebf9188a4f7f6dfab6a15a3aeb8c: Status 404 returned error can't find the container with id 4637472b48d1f6f3878589c41532d453d721ebf9188a4f7f6dfab6a15a3aeb8c Apr 16 04:25:34.577790 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.577751 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3f63c11-e5a4-44be-8b41-8c17d99cbeb5-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6lqpg\" (UID: \"c3f63c11-e5a4-44be-8b41-8c17d99cbeb5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6lqpg" Apr 16 04:25:34.577996 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.577847 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-chgzn\" (UID: \"7cfe1750-0b4e-43ab-b858-92eb84a5bd2a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" Apr 16 04:25:34.577996 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:34.577919 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 04:25:34.577996 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:34.577936 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 04:25:34.577996 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:34.577987 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-cluster-monitoring-operator-tls podName:7cfe1750-0b4e-43ab-b858-92eb84a5bd2a nodeName:}" failed. No retries permitted until 2026-04-16 04:25:35.57797258 +0000 UTC m=+91.563422621 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-chgzn" (UID: "7cfe1750-0b4e-43ab-b858-92eb84a5bd2a") : secret "cluster-monitoring-operator-tls" not found Apr 16 04:25:34.577996 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:34.578001 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3f63c11-e5a4-44be-8b41-8c17d99cbeb5-samples-operator-tls podName:c3f63c11-e5a4-44be-8b41-8c17d99cbeb5 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:35.577993414 +0000 UTC m=+91.563443456 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/c3f63c11-e5a4-44be-8b41-8c17d99cbeb5-samples-operator-tls") pod "cluster-samples-operator-667775844f-6lqpg" (UID: "c3f63c11-e5a4-44be-8b41-8c17d99cbeb5") : secret "samples-operator-tls" not found Apr 16 04:25:34.943562 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.943458 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kshw6" event={"ID":"3f7e1800-0249-49d9-8db4-2138bd8e9201","Type":"ContainerStarted","Data":"4637472b48d1f6f3878589c41532d453d721ebf9188a4f7f6dfab6a15a3aeb8c"} Apr 16 04:25:34.944819 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.944783 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-crmkh" event={"ID":"82c18366-78a3-498d-b2b0-202d98470a14","Type":"ContainerStarted","Data":"a91690a7f27f3c440e2cfee0e6c9ae332169816bbca600de23e0489a33260404"} Apr 16 04:25:34.945981 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:34.945956 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" event={"ID":"65296c0d-211b-4c4b-8926-070aad0da721","Type":"ContainerStarted","Data":"dd8b8ab44fcd110726b1fc4b76331d3864a355209beeef94163260780836e3d4"} Apr 16 04:25:35.587731 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:35.587697 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3f63c11-e5a4-44be-8b41-8c17d99cbeb5-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6lqpg\" (UID: \"c3f63c11-e5a4-44be-8b41-8c17d99cbeb5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6lqpg" Apr 16 04:25:35.587961 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:35.587800 2567 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-chgzn\" (UID: \"7cfe1750-0b4e-43ab-b858-92eb84a5bd2a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" Apr 16 04:25:35.587961 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:35.587880 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 04:25:35.587961 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:35.587940 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 04:25:35.588174 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:35.587974 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3f63c11-e5a4-44be-8b41-8c17d99cbeb5-samples-operator-tls podName:c3f63c11-e5a4-44be-8b41-8c17d99cbeb5 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:37.587937305 +0000 UTC m=+93.573387348 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/c3f63c11-e5a4-44be-8b41-8c17d99cbeb5-samples-operator-tls") pod "cluster-samples-operator-667775844f-6lqpg" (UID: "c3f63c11-e5a4-44be-8b41-8c17d99cbeb5") : secret "samples-operator-tls" not found Apr 16 04:25:35.588174 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:35.588004 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-cluster-monitoring-operator-tls podName:7cfe1750-0b4e-43ab-b858-92eb84a5bd2a nodeName:}" failed. No retries permitted until 2026-04-16 04:25:37.587989397 +0000 UTC m=+93.573439453 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-chgzn" (UID: "7cfe1750-0b4e-43ab-b858-92eb84a5bd2a") : secret "cluster-monitoring-operator-tls" not found Apr 16 04:25:35.949517 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:35.949451 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-crmkh" event={"ID":"82c18366-78a3-498d-b2b0-202d98470a14","Type":"ContainerStarted","Data":"a20b9e7cdde8ae34621a59397ffb1c0b7418ac3024c69732954771577f1849fc"} Apr 16 04:25:35.967684 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:35.964853 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-crmkh" podStartSLOduration=1.401246541 podStartE2EDuration="2.964814499s" podCreationTimestamp="2026-04-16 04:25:33 +0000 UTC" firstStartedPulling="2026-04-16 04:25:34.305081625 +0000 UTC m=+90.290531681" lastFinishedPulling="2026-04-16 04:25:35.868649582 +0000 UTC m=+91.854099639" observedRunningTime="2026-04-16 04:25:35.963220807 +0000 UTC m=+91.948670884" watchObservedRunningTime="2026-04-16 04:25:35.964814499 +0000 UTC m=+91.950264563" Apr 16 04:25:37.604423 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:37.604382 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-chgzn\" (UID: \"7cfe1750-0b4e-43ab-b858-92eb84a5bd2a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" Apr 16 04:25:37.604897 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:37.604470 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3f63c11-e5a4-44be-8b41-8c17d99cbeb5-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6lqpg\" (UID: \"c3f63c11-e5a4-44be-8b41-8c17d99cbeb5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6lqpg" Apr 16 04:25:37.604897 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:37.604530 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 04:25:37.604897 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:37.604593 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-cluster-monitoring-operator-tls podName:7cfe1750-0b4e-43ab-b858-92eb84a5bd2a nodeName:}" failed. No retries permitted until 2026-04-16 04:25:41.604577911 +0000 UTC m=+97.590027953 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-chgzn" (UID: "7cfe1750-0b4e-43ab-b858-92eb84a5bd2a") : secret "cluster-monitoring-operator-tls" not found Apr 16 04:25:37.604897 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:37.604592 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 04:25:37.604897 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:37.604620 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3f63c11-e5a4-44be-8b41-8c17d99cbeb5-samples-operator-tls podName:c3f63c11-e5a4-44be-8b41-8c17d99cbeb5 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:41.604614543 +0000 UTC m=+97.590064585 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/c3f63c11-e5a4-44be-8b41-8c17d99cbeb5-samples-operator-tls") pod "cluster-samples-operator-667775844f-6lqpg" (UID: "c3f63c11-e5a4-44be-8b41-8c17d99cbeb5") : secret "samples-operator-tls" not found Apr 16 04:25:37.822511 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:37.822474 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-dzsx5"] Apr 16 04:25:37.825871 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:37.825854 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-dzsx5" Apr 16 04:25:37.828553 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:37.828529 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 04:25:37.829821 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:37.829789 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 04:25:37.829821 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:37.829810 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-dgwl4\"" Apr 16 04:25:37.833864 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:37.833843 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-dzsx5"] Apr 16 04:25:37.905817 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:37.905734 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw6hk\" (UniqueName: \"kubernetes.io/projected/25544907-fe6f-4ea8-bb7d-71f79b123309-kube-api-access-hw6hk\") pod 
\"migrator-64d4d94569-dzsx5\" (UID: \"25544907-fe6f-4ea8-bb7d-71f79b123309\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-dzsx5" Apr 16 04:25:37.957284 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:37.957245 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kshw6" event={"ID":"3f7e1800-0249-49d9-8db4-2138bd8e9201","Type":"ContainerStarted","Data":"a075877246791dfd787faa93bbbdb21f9ea23834cd7892ab42678205eae03449"} Apr 16 04:25:37.958527 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:37.958502 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" event={"ID":"65296c0d-211b-4c4b-8926-070aad0da721","Type":"ContainerStarted","Data":"a42cdfe8223e883d6f488e6d676feff6c89077bdf3f4341e86ac7f257863ef82"} Apr 16 04:25:37.974191 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:37.974144 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kshw6" podStartSLOduration=2.433669648 podStartE2EDuration="4.974131182s" podCreationTimestamp="2026-04-16 04:25:33 +0000 UTC" firstStartedPulling="2026-04-16 04:25:34.388378934 +0000 UTC m=+90.373828976" lastFinishedPulling="2026-04-16 04:25:36.928840468 +0000 UTC m=+92.914290510" observedRunningTime="2026-04-16 04:25:37.973800662 +0000 UTC m=+93.959250727" watchObservedRunningTime="2026-04-16 04:25:37.974131182 +0000 UTC m=+93.959581246" Apr 16 04:25:37.992211 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:37.992161 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" podStartSLOduration=2.355043393 podStartE2EDuration="4.992147325s" podCreationTimestamp="2026-04-16 04:25:33 +0000 UTC" firstStartedPulling="2026-04-16 04:25:34.288479459 +0000 UTC m=+90.273929501" lastFinishedPulling="2026-04-16 04:25:36.925583386 +0000 UTC 
m=+92.911033433" observedRunningTime="2026-04-16 04:25:37.991446252 +0000 UTC m=+93.976896318" watchObservedRunningTime="2026-04-16 04:25:37.992147325 +0000 UTC m=+93.977597397" Apr 16 04:25:38.007171 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:38.007137 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hw6hk\" (UniqueName: \"kubernetes.io/projected/25544907-fe6f-4ea8-bb7d-71f79b123309-kube-api-access-hw6hk\") pod \"migrator-64d4d94569-dzsx5\" (UID: \"25544907-fe6f-4ea8-bb7d-71f79b123309\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-dzsx5" Apr 16 04:25:38.018106 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:38.018076 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw6hk\" (UniqueName: \"kubernetes.io/projected/25544907-fe6f-4ea8-bb7d-71f79b123309-kube-api-access-hw6hk\") pod \"migrator-64d4d94569-dzsx5\" (UID: \"25544907-fe6f-4ea8-bb7d-71f79b123309\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-dzsx5" Apr 16 04:25:38.135704 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:38.135660 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-dzsx5" Apr 16 04:25:38.248845 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:38.248794 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-dzsx5"] Apr 16 04:25:38.252284 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:25:38.252256 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25544907_fe6f_4ea8_bb7d_71f79b123309.slice/crio-5d01c8d09589bb147a4a78db28e54dec1066f085decb953d63a156923dc152cd WatchSource:0}: Error finding container 5d01c8d09589bb147a4a78db28e54dec1066f085decb953d63a156923dc152cd: Status 404 returned error can't find the container with id 5d01c8d09589bb147a4a78db28e54dec1066f085decb953d63a156923dc152cd Apr 16 04:25:38.965314 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:38.965271 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-dzsx5" event={"ID":"25544907-fe6f-4ea8-bb7d-71f79b123309","Type":"ContainerStarted","Data":"5d01c8d09589bb147a4a78db28e54dec1066f085decb953d63a156923dc152cd"} Apr 16 04:25:39.450102 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:39.450071 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-qnjs2"] Apr 16 04:25:39.453471 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:39.453448 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-qnjs2" Apr 16 04:25:39.456036 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:39.456015 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-j5lhv\"" Apr 16 04:25:39.460010 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:39.459970 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-qnjs2"] Apr 16 04:25:39.518867 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:39.518815 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvxl7\" (UniqueName: \"kubernetes.io/projected/86031e3b-eb84-42df-b1e9-35baad25d181-kube-api-access-gvxl7\") pod \"network-check-source-7b678d77c7-qnjs2\" (UID: \"86031e3b-eb84-42df-b1e9-35baad25d181\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-qnjs2" Apr 16 04:25:39.620242 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:39.620210 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvxl7\" (UniqueName: \"kubernetes.io/projected/86031e3b-eb84-42df-b1e9-35baad25d181-kube-api-access-gvxl7\") pod \"network-check-source-7b678d77c7-qnjs2\" (UID: \"86031e3b-eb84-42df-b1e9-35baad25d181\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-qnjs2" Apr 16 04:25:39.628346 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:39.628319 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvxl7\" (UniqueName: \"kubernetes.io/projected/86031e3b-eb84-42df-b1e9-35baad25d181-kube-api-access-gvxl7\") pod \"network-check-source-7b678d77c7-qnjs2\" (UID: \"86031e3b-eb84-42df-b1e9-35baad25d181\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-qnjs2" Apr 16 04:25:39.765543 ip-10-0-133-81 kubenswrapper[2567]: I0416 
04:25:39.765518 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-qnjs2" Apr 16 04:25:39.881742 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:39.881708 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-qnjs2"] Apr 16 04:25:39.884582 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:25:39.884553 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86031e3b_eb84_42df_b1e9_35baad25d181.slice/crio-c83a2f8551eb16b3a31d0ee3a33cbef3dedbea145af93cca8e077d6e6423d365 WatchSource:0}: Error finding container c83a2f8551eb16b3a31d0ee3a33cbef3dedbea145af93cca8e077d6e6423d365: Status 404 returned error can't find the container with id c83a2f8551eb16b3a31d0ee3a33cbef3dedbea145af93cca8e077d6e6423d365 Apr 16 04:25:39.970757 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:39.970668 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-dzsx5" event={"ID":"25544907-fe6f-4ea8-bb7d-71f79b123309","Type":"ContainerStarted","Data":"50f3e0e34a430d088d1c6717aa61f99f6f050ffe7cc1e3411c71e33adb95cce8"} Apr 16 04:25:39.970757 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:39.970713 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-dzsx5" event={"ID":"25544907-fe6f-4ea8-bb7d-71f79b123309","Type":"ContainerStarted","Data":"be914c6e28c586a57e0f27ce54c8380324a4e2754384722ed1501b6dfb0315e8"} Apr 16 04:25:39.972069 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:39.972042 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-qnjs2" 
event={"ID":"86031e3b-eb84-42df-b1e9-35baad25d181","Type":"ContainerStarted","Data":"2a3f2b2040f484fb034944fa3c768f7078f52bcac39323c16ca9ccf4db24e78a"} Apr 16 04:25:39.972163 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:39.972076 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-qnjs2" event={"ID":"86031e3b-eb84-42df-b1e9-35baad25d181","Type":"ContainerStarted","Data":"c83a2f8551eb16b3a31d0ee3a33cbef3dedbea145af93cca8e077d6e6423d365"} Apr 16 04:25:39.989761 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:39.989706 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-dzsx5" podStartSLOduration=1.598952362 podStartE2EDuration="2.989690119s" podCreationTimestamp="2026-04-16 04:25:37 +0000 UTC" firstStartedPulling="2026-04-16 04:25:38.254583803 +0000 UTC m=+94.240033850" lastFinishedPulling="2026-04-16 04:25:39.645321561 +0000 UTC m=+95.630771607" observedRunningTime="2026-04-16 04:25:39.988790429 +0000 UTC m=+95.974240496" watchObservedRunningTime="2026-04-16 04:25:39.989690119 +0000 UTC m=+95.975140182" Apr 16 04:25:40.004680 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:40.004630 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-qnjs2" podStartSLOduration=1.004616875 podStartE2EDuration="1.004616875s" podCreationTimestamp="2026-04-16 04:25:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 04:25:40.004178914 +0000 UTC m=+95.989628980" watchObservedRunningTime="2026-04-16 04:25:40.004616875 +0000 UTC m=+95.990066938" Apr 16 04:25:41.232386 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:41.232339 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls\") pod \"image-registry-589df4bd79-gpnbv\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") " pod="openshift-image-registry/image-registry-589df4bd79-gpnbv" Apr 16 04:25:41.232731 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:41.232447 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 04:25:41.232731 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:41.232458 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-589df4bd79-gpnbv: secret "image-registry-tls" not found Apr 16 04:25:41.232731 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:41.232531 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls podName:cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd nodeName:}" failed. No retries permitted until 2026-04-16 04:26:45.232515789 +0000 UTC m=+161.217965831 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls") pod "image-registry-589df4bd79-gpnbv" (UID: "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd") : secret "image-registry-tls" not found Apr 16 04:25:41.333270 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:41.333238 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls\") pod \"dns-default-m5jm4\" (UID: \"390e05c5-2dbf-454b-872e-6a8969a124ae\") " pod="openshift-dns/dns-default-m5jm4" Apr 16 04:25:41.333431 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:41.333294 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert\") pod \"ingress-canary-6jqxs\" (UID: \"c19b5e4b-3455-4e1c-b332-ba6c51fb153b\") " pod="openshift-ingress-canary/ingress-canary-6jqxs" Apr 16 04:25:41.333431 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:41.333372 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 04:25:41.333431 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:41.333402 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 04:25:41.333532 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:41.333437 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls podName:390e05c5-2dbf-454b-872e-6a8969a124ae nodeName:}" failed. No retries permitted until 2026-04-16 04:26:45.333421548 +0000 UTC m=+161.318871590 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls") pod "dns-default-m5jm4" (UID: "390e05c5-2dbf-454b-872e-6a8969a124ae") : secret "dns-default-metrics-tls" not found Apr 16 04:25:41.333532 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:41.333450 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert podName:c19b5e4b-3455-4e1c-b332-ba6c51fb153b nodeName:}" failed. No retries permitted until 2026-04-16 04:26:45.333444559 +0000 UTC m=+161.318894601 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert") pod "ingress-canary-6jqxs" (UID: "c19b5e4b-3455-4e1c-b332-ba6c51fb153b") : secret "canary-serving-cert" not found Apr 16 04:25:41.341309 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:41.341284 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7pjb8_c35154e3-77e2-4f41-b5e8-99905ca385f9/dns-node-resolver/0.log" Apr 16 04:25:41.635677 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:41.635594 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3f63c11-e5a4-44be-8b41-8c17d99cbeb5-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6lqpg\" (UID: \"c3f63c11-e5a4-44be-8b41-8c17d99cbeb5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6lqpg" Apr 16 04:25:41.635677 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:41.635658 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-chgzn\" (UID: 
\"7cfe1750-0b4e-43ab-b858-92eb84a5bd2a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" Apr 16 04:25:41.635880 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:41.635739 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 04:25:41.635880 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:41.635741 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 04:25:41.635880 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:41.635793 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-cluster-monitoring-operator-tls podName:7cfe1750-0b4e-43ab-b858-92eb84a5bd2a nodeName:}" failed. No retries permitted until 2026-04-16 04:25:49.635779138 +0000 UTC m=+105.621229180 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-chgzn" (UID: "7cfe1750-0b4e-43ab-b858-92eb84a5bd2a") : secret "cluster-monitoring-operator-tls" not found Apr 16 04:25:41.635880 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:41.635806 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3f63c11-e5a4-44be-8b41-8c17d99cbeb5-samples-operator-tls podName:c3f63c11-e5a4-44be-8b41-8c17d99cbeb5 nodeName:}" failed. No retries permitted until 2026-04-16 04:25:49.635798997 +0000 UTC m=+105.621249039 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/c3f63c11-e5a4-44be-8b41-8c17d99cbeb5-samples-operator-tls") pod "cluster-samples-operator-667775844f-6lqpg" (UID: "c3f63c11-e5a4-44be-8b41-8c17d99cbeb5") : secret "samples-operator-tls" not found Apr 16 04:25:42.741154 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:42.741125 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tns7g_f41140f0-dc31-4907-aae1-f8d108bb517f/node-ca/0.log" Apr 16 04:25:43.741364 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:43.741338 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-dzsx5_25544907-fe6f-4ea8-bb7d-71f79b123309/migrator/0.log" Apr 16 04:25:43.942550 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:43.942522 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-dzsx5_25544907-fe6f-4ea8-bb7d-71f79b123309/graceful-termination/0.log" Apr 16 04:25:46.906817 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:46.906783 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-b2zb5" Apr 16 04:25:49.710975 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:49.710928 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-chgzn\" (UID: \"7cfe1750-0b4e-43ab-b858-92eb84a5bd2a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" Apr 16 04:25:49.711366 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:49.711027 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/c3f63c11-e5a4-44be-8b41-8c17d99cbeb5-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6lqpg\" (UID: \"c3f63c11-e5a4-44be-8b41-8c17d99cbeb5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6lqpg" Apr 16 04:25:49.711366 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:49.711086 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 04:25:49.711366 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:25:49.711164 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-cluster-monitoring-operator-tls podName:7cfe1750-0b4e-43ab-b858-92eb84a5bd2a nodeName:}" failed. No retries permitted until 2026-04-16 04:26:05.711148721 +0000 UTC m=+121.696598762 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-chgzn" (UID: "7cfe1750-0b4e-43ab-b858-92eb84a5bd2a") : secret "cluster-monitoring-operator-tls" not found Apr 16 04:25:49.713891 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:49.713861 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3f63c11-e5a4-44be-8b41-8c17d99cbeb5-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6lqpg\" (UID: \"c3f63c11-e5a4-44be-8b41-8c17d99cbeb5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6lqpg" Apr 16 04:25:49.778729 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:49.778692 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6lqpg" Apr 16 04:25:49.899794 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:49.899761 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6lqpg"] Apr 16 04:25:50.001408 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:50.001369 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6lqpg" event={"ID":"c3f63c11-e5a4-44be-8b41-8c17d99cbeb5","Type":"ContainerStarted","Data":"641a674e671aa55a0474a4abed92c225f73f75b87cbfc0c0b681cced34ff6490"} Apr 16 04:25:52.008590 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:52.008505 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6lqpg" event={"ID":"c3f63c11-e5a4-44be-8b41-8c17d99cbeb5","Type":"ContainerStarted","Data":"61dc08dc615493a8beab0bafe698103a8c42acd1476fc7f41b712129e4990778"} Apr 16 04:25:52.008590 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:52.008540 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6lqpg" event={"ID":"c3f63c11-e5a4-44be-8b41-8c17d99cbeb5","Type":"ContainerStarted","Data":"9e5343ee51fd217af41ef8b4401882847b1f5fba046fb6da01b4871aa4f95332"} Apr 16 04:25:52.028987 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:25:52.028932 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6lqpg" podStartSLOduration=17.220577785 podStartE2EDuration="19.028916688s" podCreationTimestamp="2026-04-16 04:25:33 +0000 UTC" firstStartedPulling="2026-04-16 04:25:49.951937887 +0000 UTC m=+105.937387929" lastFinishedPulling="2026-04-16 04:25:51.760276569 +0000 UTC m=+107.745726832" observedRunningTime="2026-04-16 
04:25:52.027594807 +0000 UTC m=+108.013044872" watchObservedRunningTime="2026-04-16 04:25:52.028916688 +0000 UTC m=+108.014366752" Apr 16 04:26:01.155016 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.154983 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-ltqzz"] Apr 16 04:26:01.158097 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.158074 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-ltqzz" Apr 16 04:26:01.162083 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.162057 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-xkn6v\"" Apr 16 04:26:01.162215 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.162096 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 04:26:01.162215 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.162057 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 04:26:01.167387 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.167363 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-ltqzz"] Apr 16 04:26:01.208288 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.208253 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cd8f751f-9b6f-4484-bebc-72c3ef2e887a-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-ltqzz\" (UID: \"cd8f751f-9b6f-4484-bebc-72c3ef2e887a\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-ltqzz" Apr 16 04:26:01.208451 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.208312 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cd8f751f-9b6f-4484-bebc-72c3ef2e887a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-ltqzz\" (UID: \"cd8f751f-9b6f-4484-bebc-72c3ef2e887a\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-ltqzz" Apr 16 04:26:01.240768 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.240731 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8kmvb"] Apr 16 04:26:01.243977 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.243954 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8kmvb" Apr 16 04:26:01.246651 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.246626 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-stqlj\"" Apr 16 04:26:01.247168 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.247149 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 04:26:01.247285 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.247152 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 04:26:01.257314 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.257275 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8kmvb"] Apr 16 04:26:01.309434 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.309392 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4e36f31c-8145-464a-af78-558a0e3d7c33-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-8kmvb\" (UID: \"4e36f31c-8145-464a-af78-558a0e3d7c33\") " pod="openshift-insights/insights-runtime-extractor-8kmvb" Apr 16 04:26:01.309434 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.309436 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4e36f31c-8145-464a-af78-558a0e3d7c33-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8kmvb\" (UID: \"4e36f31c-8145-464a-af78-558a0e3d7c33\") " pod="openshift-insights/insights-runtime-extractor-8kmvb" Apr 16 04:26:01.309757 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.309469 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cd8f751f-9b6f-4484-bebc-72c3ef2e887a-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-ltqzz\" (UID: \"cd8f751f-9b6f-4484-bebc-72c3ef2e887a\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-ltqzz" Apr 16 04:26:01.309757 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.309536 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thhlj\" (UniqueName: \"kubernetes.io/projected/4e36f31c-8145-464a-af78-558a0e3d7c33-kube-api-access-thhlj\") pod \"insights-runtime-extractor-8kmvb\" (UID: \"4e36f31c-8145-464a-af78-558a0e3d7c33\") " pod="openshift-insights/insights-runtime-extractor-8kmvb" Apr 16 04:26:01.309757 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.309654 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cd8f751f-9b6f-4484-bebc-72c3ef2e887a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-ltqzz\" (UID: \"cd8f751f-9b6f-4484-bebc-72c3ef2e887a\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-ltqzz" Apr 16 04:26:01.309757 
ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.309704 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4e36f31c-8145-464a-af78-558a0e3d7c33-crio-socket\") pod \"insights-runtime-extractor-8kmvb\" (UID: \"4e36f31c-8145-464a-af78-558a0e3d7c33\") " pod="openshift-insights/insights-runtime-extractor-8kmvb" Apr 16 04:26:01.310000 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.309815 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4e36f31c-8145-464a-af78-558a0e3d7c33-data-volume\") pod \"insights-runtime-extractor-8kmvb\" (UID: \"4e36f31c-8145-464a-af78-558a0e3d7c33\") " pod="openshift-insights/insights-runtime-extractor-8kmvb" Apr 16 04:26:01.310186 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.310167 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cd8f751f-9b6f-4484-bebc-72c3ef2e887a-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-ltqzz\" (UID: \"cd8f751f-9b6f-4484-bebc-72c3ef2e887a\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-ltqzz" Apr 16 04:26:01.312174 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.312151 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cd8f751f-9b6f-4484-bebc-72c3ef2e887a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-ltqzz\" (UID: \"cd8f751f-9b6f-4484-bebc-72c3ef2e887a\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-ltqzz" Apr 16 04:26:01.410863 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.410741 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/4e36f31c-8145-464a-af78-558a0e3d7c33-crio-socket\") pod \"insights-runtime-extractor-8kmvb\" (UID: \"4e36f31c-8145-464a-af78-558a0e3d7c33\") " pod="openshift-insights/insights-runtime-extractor-8kmvb" Apr 16 04:26:01.410863 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.410791 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4e36f31c-8145-464a-af78-558a0e3d7c33-data-volume\") pod \"insights-runtime-extractor-8kmvb\" (UID: \"4e36f31c-8145-464a-af78-558a0e3d7c33\") " pod="openshift-insights/insights-runtime-extractor-8kmvb" Apr 16 04:26:01.411094 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.410887 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4e36f31c-8145-464a-af78-558a0e3d7c33-crio-socket\") pod \"insights-runtime-extractor-8kmvb\" (UID: \"4e36f31c-8145-464a-af78-558a0e3d7c33\") " pod="openshift-insights/insights-runtime-extractor-8kmvb" Apr 16 04:26:01.411094 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.410898 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4e36f31c-8145-464a-af78-558a0e3d7c33-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8kmvb\" (UID: \"4e36f31c-8145-464a-af78-558a0e3d7c33\") " pod="openshift-insights/insights-runtime-extractor-8kmvb" Apr 16 04:26:01.411094 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.410949 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4e36f31c-8145-464a-af78-558a0e3d7c33-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8kmvb\" (UID: \"4e36f31c-8145-464a-af78-558a0e3d7c33\") " pod="openshift-insights/insights-runtime-extractor-8kmvb" Apr 16 04:26:01.411094 ip-10-0-133-81 kubenswrapper[2567]: 
I0416 04:26:01.410979 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thhlj\" (UniqueName: \"kubernetes.io/projected/4e36f31c-8145-464a-af78-558a0e3d7c33-kube-api-access-thhlj\") pod \"insights-runtime-extractor-8kmvb\" (UID: \"4e36f31c-8145-464a-af78-558a0e3d7c33\") " pod="openshift-insights/insights-runtime-extractor-8kmvb" Apr 16 04:26:01.411294 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.411164 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4e36f31c-8145-464a-af78-558a0e3d7c33-data-volume\") pod \"insights-runtime-extractor-8kmvb\" (UID: \"4e36f31c-8145-464a-af78-558a0e3d7c33\") " pod="openshift-insights/insights-runtime-extractor-8kmvb" Apr 16 04:26:01.411396 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.411377 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4e36f31c-8145-464a-af78-558a0e3d7c33-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8kmvb\" (UID: \"4e36f31c-8145-464a-af78-558a0e3d7c33\") " pod="openshift-insights/insights-runtime-extractor-8kmvb" Apr 16 04:26:01.413223 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.413198 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4e36f31c-8145-464a-af78-558a0e3d7c33-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8kmvb\" (UID: \"4e36f31c-8145-464a-af78-558a0e3d7c33\") " pod="openshift-insights/insights-runtime-extractor-8kmvb" Apr 16 04:26:01.419361 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.419330 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thhlj\" (UniqueName: \"kubernetes.io/projected/4e36f31c-8145-464a-af78-558a0e3d7c33-kube-api-access-thhlj\") pod \"insights-runtime-extractor-8kmvb\" (UID: 
\"4e36f31c-8145-464a-af78-558a0e3d7c33\") " pod="openshift-insights/insights-runtime-extractor-8kmvb" Apr 16 04:26:01.467713 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.467678 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-ltqzz" Apr 16 04:26:01.554225 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.554188 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8kmvb" Apr 16 04:26:01.585311 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.585274 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-ltqzz"] Apr 16 04:26:01.588714 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:26:01.588683 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd8f751f_9b6f_4484_bebc_72c3ef2e887a.slice/crio-e297a6d7a3e5fc39ab7e100ecd8f04fe9b74841b88ba13c892e3d6b0cfb063c1 WatchSource:0}: Error finding container e297a6d7a3e5fc39ab7e100ecd8f04fe9b74841b88ba13c892e3d6b0cfb063c1: Status 404 returned error can't find the container with id e297a6d7a3e5fc39ab7e100ecd8f04fe9b74841b88ba13c892e3d6b0cfb063c1 Apr 16 04:26:01.677945 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:01.677867 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8kmvb"] Apr 16 04:26:01.681129 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:26:01.681098 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e36f31c_8145_464a_af78_558a0e3d7c33.slice/crio-bd56e031075cd43a919c9139367f8bd15917214ad9f7795cb2e80fcc4fe0e01c WatchSource:0}: Error finding container bd56e031075cd43a919c9139367f8bd15917214ad9f7795cb2e80fcc4fe0e01c: Status 404 returned error can't find the container 
with id bd56e031075cd43a919c9139367f8bd15917214ad9f7795cb2e80fcc4fe0e01c Apr 16 04:26:02.036252 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:02.036210 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8kmvb" event={"ID":"4e36f31c-8145-464a-af78-558a0e3d7c33","Type":"ContainerStarted","Data":"079128fadf4e0942fa4c094b8047e06d288ee0657d18c8ad1ff9385a77c6c063"} Apr 16 04:26:02.036252 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:02.036253 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8kmvb" event={"ID":"4e36f31c-8145-464a-af78-558a0e3d7c33","Type":"ContainerStarted","Data":"bd56e031075cd43a919c9139367f8bd15917214ad9f7795cb2e80fcc4fe0e01c"} Apr 16 04:26:02.037102 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:02.037083 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-ltqzz" event={"ID":"cd8f751f-9b6f-4484-bebc-72c3ef2e887a","Type":"ContainerStarted","Data":"e297a6d7a3e5fc39ab7e100ecd8f04fe9b74841b88ba13c892e3d6b0cfb063c1"} Apr 16 04:26:03.042723 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:03.042680 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8kmvb" event={"ID":"4e36f31c-8145-464a-af78-558a0e3d7c33","Type":"ContainerStarted","Data":"72f9783a4fcded631be2039d5844174a91ddd30d2c9730e0766c923cf7b29816"} Apr 16 04:26:03.044208 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:03.044180 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-ltqzz" event={"ID":"cd8f751f-9b6f-4484-bebc-72c3ef2e887a","Type":"ContainerStarted","Data":"8ecadb28e8934efc0a64e263f0c69a98ab18f9f659179ea9a9db653f92131137"} Apr 16 04:26:03.059736 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:03.059680 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-ltqzz" podStartSLOduration=0.737520189 podStartE2EDuration="2.059634997s" podCreationTimestamp="2026-04-16 04:26:01 +0000 UTC" firstStartedPulling="2026-04-16 04:26:01.590485361 +0000 UTC m=+117.575935403" lastFinishedPulling="2026-04-16 04:26:02.912600155 +0000 UTC m=+118.898050211" observedRunningTime="2026-04-16 04:26:03.058733492 +0000 UTC m=+119.044183556" watchObservedRunningTime="2026-04-16 04:26:03.059634997 +0000 UTC m=+119.045085060" Apr 16 04:26:04.049111 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:04.049076 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8kmvb" event={"ID":"4e36f31c-8145-464a-af78-558a0e3d7c33","Type":"ContainerStarted","Data":"dfcfe5f8fcdd391becf1a6b5a3df7bfc6ccfc1000a8a82b6c072d7dd05bdd319"} Apr 16 04:26:04.065531 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:04.065476 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8kmvb" podStartSLOduration=0.8324771 podStartE2EDuration="3.065460525s" podCreationTimestamp="2026-04-16 04:26:01 +0000 UTC" firstStartedPulling="2026-04-16 04:26:01.735933062 +0000 UTC m=+117.721383108" lastFinishedPulling="2026-04-16 04:26:03.968916492 +0000 UTC m=+119.954366533" observedRunningTime="2026-04-16 04:26:04.064383894 +0000 UTC m=+120.049833957" watchObservedRunningTime="2026-04-16 04:26:04.065460525 +0000 UTC m=+120.050910589" Apr 16 04:26:05.747380 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:05.747343 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-chgzn\" (UID: \"7cfe1750-0b4e-43ab-b858-92eb84a5bd2a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" Apr 
16 04:26:05.749700 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:05.749676 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7cfe1750-0b4e-43ab-b858-92eb84a5bd2a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-chgzn\" (UID: \"7cfe1750-0b4e-43ab-b858-92eb84a5bd2a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" Apr 16 04:26:05.975492 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:05.975462 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-fznvl\"" Apr 16 04:26:05.983638 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:05.983609 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" Apr 16 04:26:06.102422 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:06.102393 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn"] Apr 16 04:26:06.105370 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:26:06.105338 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cfe1750_0b4e_43ab_b858_92eb84a5bd2a.slice/crio-d3e6d897c5423202a86bbeaafc4725d009112b753f2efab8ce1d945850d42e9f WatchSource:0}: Error finding container d3e6d897c5423202a86bbeaafc4725d009112b753f2efab8ce1d945850d42e9f: Status 404 returned error can't find the container with id d3e6d897c5423202a86bbeaafc4725d009112b753f2efab8ce1d945850d42e9f Apr 16 04:26:07.058365 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:07.058326 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" 
event={"ID":"7cfe1750-0b4e-43ab-b858-92eb84a5bd2a","Type":"ContainerStarted","Data":"d3e6d897c5423202a86bbeaafc4725d009112b753f2efab8ce1d945850d42e9f"} Apr 16 04:26:08.062755 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:08.062721 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" event={"ID":"7cfe1750-0b4e-43ab-b858-92eb84a5bd2a","Type":"ContainerStarted","Data":"3d3c800696ba2d0708321b9f1e9ac9e0ea60d75404b27acd02e03f8000ae64df"} Apr 16 04:26:08.079150 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:08.079091 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" podStartSLOduration=33.280393115 podStartE2EDuration="35.079077061s" podCreationTimestamp="2026-04-16 04:25:33 +0000 UTC" firstStartedPulling="2026-04-16 04:26:06.107199465 +0000 UTC m=+122.092649508" lastFinishedPulling="2026-04-16 04:26:07.905883409 +0000 UTC m=+123.891333454" observedRunningTime="2026-04-16 04:26:08.078457932 +0000 UTC m=+124.063907996" watchObservedRunningTime="2026-04-16 04:26:08.079077061 +0000 UTC m=+124.064527162" Apr 16 04:26:08.381315 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:08.381280 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kf7rc"] Apr 16 04:26:08.384348 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:08.384330 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kf7rc" Apr 16 04:26:08.386911 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:08.386882 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 04:26:08.387068 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:08.386920 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-x7kl5\"" Apr 16 04:26:08.393139 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:08.393109 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kf7rc"] Apr 16 04:26:08.468591 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:08.468549 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/928211b1-3493-4f55-82b3-0bd62a77bf41-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-kf7rc\" (UID: \"928211b1-3493-4f55-82b3-0bd62a77bf41\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kf7rc" Apr 16 04:26:08.570036 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:08.569996 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/928211b1-3493-4f55-82b3-0bd62a77bf41-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-kf7rc\" (UID: \"928211b1-3493-4f55-82b3-0bd62a77bf41\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kf7rc" Apr 16 04:26:08.570179 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:26:08.570137 2567 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not 
found Apr 16 04:26:08.570217 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:26:08.570201 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/928211b1-3493-4f55-82b3-0bd62a77bf41-tls-certificates podName:928211b1-3493-4f55-82b3-0bd62a77bf41 nodeName:}" failed. No retries permitted until 2026-04-16 04:26:09.07018577 +0000 UTC m=+125.055635812 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/928211b1-3493-4f55-82b3-0bd62a77bf41-tls-certificates") pod "prometheus-operator-admission-webhook-9cb97cd87-kf7rc" (UID: "928211b1-3493-4f55-82b3-0bd62a77bf41") : secret "prometheus-operator-admission-webhook-tls" not found Apr 16 04:26:09.074128 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:09.074091 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/928211b1-3493-4f55-82b3-0bd62a77bf41-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-kf7rc\" (UID: \"928211b1-3493-4f55-82b3-0bd62a77bf41\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kf7rc" Apr 16 04:26:09.076656 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:09.076634 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/928211b1-3493-4f55-82b3-0bd62a77bf41-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-kf7rc\" (UID: \"928211b1-3493-4f55-82b3-0bd62a77bf41\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kf7rc" Apr 16 04:26:09.293648 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:09.293595 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kf7rc" Apr 16 04:26:09.412247 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:09.412213 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kf7rc"] Apr 16 04:26:09.415239 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:26:09.415212 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod928211b1_3493_4f55_82b3_0bd62a77bf41.slice/crio-7d02d968f91c8437f84b479bf84ea3a13112c597db5c398c4cfb5ffb45ff3278 WatchSource:0}: Error finding container 7d02d968f91c8437f84b479bf84ea3a13112c597db5c398c4cfb5ffb45ff3278: Status 404 returned error can't find the container with id 7d02d968f91c8437f84b479bf84ea3a13112c597db5c398c4cfb5ffb45ff3278 Apr 16 04:26:10.069351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:10.069311 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kf7rc" event={"ID":"928211b1-3493-4f55-82b3-0bd62a77bf41","Type":"ContainerStarted","Data":"7d02d968f91c8437f84b479bf84ea3a13112c597db5c398c4cfb5ffb45ff3278"} Apr 16 04:26:11.073150 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:11.073113 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kf7rc" event={"ID":"928211b1-3493-4f55-82b3-0bd62a77bf41","Type":"ContainerStarted","Data":"cbeef621b459c994dcaeee8ca3907d967a8e6f5c81f8df84b29f7edf2bb7d0c5"} Apr 16 04:26:11.073560 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:11.073367 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kf7rc" Apr 16 04:26:11.077896 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:11.077872 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kf7rc" Apr 16 04:26:11.087513 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:11.087459 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kf7rc" podStartSLOduration=1.601939848 podStartE2EDuration="3.087443571s" podCreationTimestamp="2026-04-16 04:26:08 +0000 UTC" firstStartedPulling="2026-04-16 04:26:09.417112324 +0000 UTC m=+125.402562370" lastFinishedPulling="2026-04-16 04:26:10.902616051 +0000 UTC m=+126.888066093" observedRunningTime="2026-04-16 04:26:11.086466697 +0000 UTC m=+127.071916760" watchObservedRunningTime="2026-04-16 04:26:11.087443571 +0000 UTC m=+127.072893635" Apr 16 04:26:11.439035 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:11.438998 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-9pphv"] Apr 16 04:26:11.461098 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:11.461066 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-9pphv"] Apr 16 04:26:11.461263 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:11.461233 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-9pphv" Apr 16 04:26:11.464048 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:11.464024 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 04:26:11.465421 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:11.465398 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-4mt66\"" Apr 16 04:26:11.465527 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:11.465401 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 04:26:11.465590 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:11.465579 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 04:26:11.493305 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:11.493268 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ed3956a3-f976-4384-814d-557aff30f00d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-9pphv\" (UID: \"ed3956a3-f976-4384-814d-557aff30f00d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-9pphv" Apr 16 04:26:11.493305 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:11.493308 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed3956a3-f976-4384-814d-557aff30f00d-metrics-client-ca\") pod \"prometheus-operator-78f957474d-9pphv\" (UID: \"ed3956a3-f976-4384-814d-557aff30f00d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-9pphv" Apr 16 04:26:11.493516 ip-10-0-133-81 kubenswrapper[2567]: I0416 
04:26:11.493330 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn4tf\" (UniqueName: \"kubernetes.io/projected/ed3956a3-f976-4384-814d-557aff30f00d-kube-api-access-tn4tf\") pod \"prometheus-operator-78f957474d-9pphv\" (UID: \"ed3956a3-f976-4384-814d-557aff30f00d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-9pphv" Apr 16 04:26:11.493516 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:11.493442 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ed3956a3-f976-4384-814d-557aff30f00d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-9pphv\" (UID: \"ed3956a3-f976-4384-814d-557aff30f00d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-9pphv" Apr 16 04:26:11.594031 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:11.593993 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ed3956a3-f976-4384-814d-557aff30f00d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-9pphv\" (UID: \"ed3956a3-f976-4384-814d-557aff30f00d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-9pphv" Apr 16 04:26:11.594205 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:11.594099 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ed3956a3-f976-4384-814d-557aff30f00d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-9pphv\" (UID: \"ed3956a3-f976-4384-814d-557aff30f00d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-9pphv" Apr 16 04:26:11.594205 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:11.594121 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/ed3956a3-f976-4384-814d-557aff30f00d-metrics-client-ca\") pod \"prometheus-operator-78f957474d-9pphv\" (UID: \"ed3956a3-f976-4384-814d-557aff30f00d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-9pphv" Apr 16 04:26:11.594205 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:26:11.594144 2567 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 16 04:26:11.594318 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:26:11.594233 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed3956a3-f976-4384-814d-557aff30f00d-prometheus-operator-tls podName:ed3956a3-f976-4384-814d-557aff30f00d nodeName:}" failed. No retries permitted until 2026-04-16 04:26:12.094204118 +0000 UTC m=+128.079654175 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/ed3956a3-f976-4384-814d-557aff30f00d-prometheus-operator-tls") pod "prometheus-operator-78f957474d-9pphv" (UID: "ed3956a3-f976-4384-814d-557aff30f00d") : secret "prometheus-operator-tls" not found Apr 16 04:26:11.594318 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:11.594149 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tn4tf\" (UniqueName: \"kubernetes.io/projected/ed3956a3-f976-4384-814d-557aff30f00d-kube-api-access-tn4tf\") pod \"prometheus-operator-78f957474d-9pphv\" (UID: \"ed3956a3-f976-4384-814d-557aff30f00d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-9pphv" Apr 16 04:26:11.606222 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:11.606179 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ed3956a3-f976-4384-814d-557aff30f00d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-9pphv\" (UID: 
\"ed3956a3-f976-4384-814d-557aff30f00d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-9pphv" Apr 16 04:26:11.606371 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:11.606296 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed3956a3-f976-4384-814d-557aff30f00d-metrics-client-ca\") pod \"prometheus-operator-78f957474d-9pphv\" (UID: \"ed3956a3-f976-4384-814d-557aff30f00d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-9pphv" Apr 16 04:26:11.608195 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:11.608177 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn4tf\" (UniqueName: \"kubernetes.io/projected/ed3956a3-f976-4384-814d-557aff30f00d-kube-api-access-tn4tf\") pod \"prometheus-operator-78f957474d-9pphv\" (UID: \"ed3956a3-f976-4384-814d-557aff30f00d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-9pphv" Apr 16 04:26:12.099774 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:12.099737 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ed3956a3-f976-4384-814d-557aff30f00d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-9pphv\" (UID: \"ed3956a3-f976-4384-814d-557aff30f00d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-9pphv" Apr 16 04:26:12.102230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:12.102209 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ed3956a3-f976-4384-814d-557aff30f00d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-9pphv\" (UID: \"ed3956a3-f976-4384-814d-557aff30f00d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-9pphv" Apr 16 04:26:12.370739 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:12.370640 2567 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-9pphv" Apr 16 04:26:12.491859 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:12.491793 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-9pphv"] Apr 16 04:26:12.497213 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:26:12.497185 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded3956a3_f976_4384_814d_557aff30f00d.slice/crio-59db72c6cc038f7686bcf2fb1bee720fbceac2b613a455020dcb2ebd1990ee42 WatchSource:0}: Error finding container 59db72c6cc038f7686bcf2fb1bee720fbceac2b613a455020dcb2ebd1990ee42: Status 404 returned error can't find the container with id 59db72c6cc038f7686bcf2fb1bee720fbceac2b613a455020dcb2ebd1990ee42 Apr 16 04:26:13.079590 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:13.079555 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-9pphv" event={"ID":"ed3956a3-f976-4384-814d-557aff30f00d","Type":"ContainerStarted","Data":"59db72c6cc038f7686bcf2fb1bee720fbceac2b613a455020dcb2ebd1990ee42"} Apr 16 04:26:14.085169 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:14.085110 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-9pphv" event={"ID":"ed3956a3-f976-4384-814d-557aff30f00d","Type":"ContainerStarted","Data":"3c9f76312476f37874748a43455c13a2f12d1911208fe90871fd2dfdc40552d5"} Apr 16 04:26:14.085169 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:14.085155 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-9pphv" event={"ID":"ed3956a3-f976-4384-814d-557aff30f00d","Type":"ContainerStarted","Data":"9fd357f1d62eef64bb82d935219c1d24198445a77e6b58c39cbcf054295a52bc"} Apr 16 04:26:14.101726 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:14.101654 2567 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-9pphv" podStartSLOduration=1.7011217859999999 podStartE2EDuration="3.101616617s" podCreationTimestamp="2026-04-16 04:26:11 +0000 UTC" firstStartedPulling="2026-04-16 04:26:12.499156429 +0000 UTC m=+128.484606471" lastFinishedPulling="2026-04-16 04:26:13.89965126 +0000 UTC m=+129.885101302" observedRunningTime="2026-04-16 04:26:14.10085528 +0000 UTC m=+130.086305344" watchObservedRunningTime="2026-04-16 04:26:14.101616617 +0000 UTC m=+130.087066681" Apr 16 04:26:14.318415 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:14.318379 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs\") pod \"network-metrics-daemon-j6hlh\" (UID: \"638d6e19-46c9-4d63-a7b2-461e842da022\") " pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:26:14.320670 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:14.320639 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/638d6e19-46c9-4d63-a7b2-461e842da022-metrics-certs\") pod \"network-metrics-daemon-j6hlh\" (UID: \"638d6e19-46c9-4d63-a7b2-461e842da022\") " pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:26:14.387461 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:14.387429 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xfxbt\"" Apr 16 04:26:14.395763 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:14.395735 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6hlh" Apr 16 04:26:14.510813 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:14.510765 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j6hlh"] Apr 16 04:26:14.513402 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:26:14.513369 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod638d6e19_46c9_4d63_a7b2_461e842da022.slice/crio-efdc45ab9f1fdc5617e845508186b7075ba126a50e1a0034e6787d577bdf939e WatchSource:0}: Error finding container efdc45ab9f1fdc5617e845508186b7075ba126a50e1a0034e6787d577bdf939e: Status 404 returned error can't find the container with id efdc45ab9f1fdc5617e845508186b7075ba126a50e1a0034e6787d577bdf939e Apr 16 04:26:15.089515 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:15.089476 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j6hlh" event={"ID":"638d6e19-46c9-4d63-a7b2-461e842da022","Type":"ContainerStarted","Data":"efdc45ab9f1fdc5617e845508186b7075ba126a50e1a0034e6787d577bdf939e"} Apr 16 04:26:16.093535 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:16.093495 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j6hlh" event={"ID":"638d6e19-46c9-4d63-a7b2-461e842da022","Type":"ContainerStarted","Data":"2fa15f605d33e9ea6b93dcf7e149756ced56a67de8841d6acefa450a51fe7bb8"} Apr 16 04:26:16.093535 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:16.093538 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j6hlh" event={"ID":"638d6e19-46c9-4d63-a7b2-461e842da022","Type":"ContainerStarted","Data":"be3520af1aee09025088c3f7b6b0ab22590afde236363eaf0147dc8eb624da7a"} Apr 16 04:26:16.094807 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:16.094784 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-chgzn_7cfe1750-0b4e-43ab-b858-92eb84a5bd2a/cluster-monitoring-operator/0.log"
Apr 16 04:26:16.094933 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:16.094844 2567 generic.go:358] "Generic (PLEG): container finished" podID="7cfe1750-0b4e-43ab-b858-92eb84a5bd2a" containerID="3d3c800696ba2d0708321b9f1e9ac9e0ea60d75404b27acd02e03f8000ae64df" exitCode=2
Apr 16 04:26:16.094933 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:16.094915 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" event={"ID":"7cfe1750-0b4e-43ab-b858-92eb84a5bd2a","Type":"ContainerDied","Data":"3d3c800696ba2d0708321b9f1e9ac9e0ea60d75404b27acd02e03f8000ae64df"}
Apr 16 04:26:16.095181 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:16.095169 2567 scope.go:117] "RemoveContainer" containerID="3d3c800696ba2d0708321b9f1e9ac9e0ea60d75404b27acd02e03f8000ae64df"
Apr 16 04:26:16.112192 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:16.112147 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-j6hlh" podStartSLOduration=131.163995736 podStartE2EDuration="2m12.112113425s" podCreationTimestamp="2026-04-16 04:24:04 +0000 UTC" firstStartedPulling="2026-04-16 04:26:14.515210401 +0000 UTC m=+130.500660443" lastFinishedPulling="2026-04-16 04:26:15.463328087 +0000 UTC m=+131.448778132" observedRunningTime="2026-04-16 04:26:16.111803112 +0000 UTC m=+132.097253171" watchObservedRunningTime="2026-04-16 04:26:16.112113425 +0000 UTC m=+132.097563487"
Apr 16 04:26:17.099497 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:17.099465 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-chgzn_7cfe1750-0b4e-43ab-b858-92eb84a5bd2a/cluster-monitoring-operator/0.log"
Apr 16 04:26:17.099988 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:17.099587 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-chgzn" event={"ID":"7cfe1750-0b4e-43ab-b858-92eb84a5bd2a","Type":"ContainerStarted","Data":"b2e99e6c50a40be5e7a5afaeaf4f556963201e9bed14559961adad332edc7773"}
Apr 16 04:26:19.754717 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.754674 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-gf878"]
Apr 16 04:26:19.758295 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.758270 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878"
Apr 16 04:26:19.761039 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.761000 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 16 04:26:19.761174 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.761092 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 16 04:26:19.762394 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.762365 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 16 04:26:19.762510 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.762415 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-m7vdr\""
Apr 16 04:26:19.764804 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.764780 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-rgpmw"]
Apr 16 04:26:19.768269 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.768247 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-gf878"]
Apr 16 04:26:19.768373 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.768295 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.770701 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.770677 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 04:26:19.770808 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.770727 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-s26sx\""
Apr 16 04:26:19.770808 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.770751 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 04:26:19.770808 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.770727 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 04:26:19.866276 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.866234 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/845d2afa-de51-4c2f-95f1-6416de335031-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-gf878\" (UID: \"845d2afa-de51-4c2f-95f1-6416de335031\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878"
Apr 16 04:26:19.866276 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.866277 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfsf4\" (UniqueName: \"kubernetes.io/projected/845d2afa-de51-4c2f-95f1-6416de335031-kube-api-access-wfsf4\") pod \"kube-state-metrics-7479c89684-gf878\" (UID: \"845d2afa-de51-4c2f-95f1-6416de335031\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878"
Apr 16 04:26:19.866504 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.866301 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2c02d360-9b4f-44d2-802b-44aa2d3ca611-node-exporter-tls\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.866504 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.866379 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2c02d360-9b4f-44d2-802b-44aa2d3ca611-node-exporter-wtmp\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.866504 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.866414 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/845d2afa-de51-4c2f-95f1-6416de335031-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-gf878\" (UID: \"845d2afa-de51-4c2f-95f1-6416de335031\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878"
Apr 16 04:26:19.866504 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.866444 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2c02d360-9b4f-44d2-802b-44aa2d3ca611-node-exporter-accelerators-collector-config\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.866504 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.866463 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2c02d360-9b4f-44d2-802b-44aa2d3ca611-root\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.866722 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.866566 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2c02d360-9b4f-44d2-802b-44aa2d3ca611-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.866722 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.866595 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/845d2afa-de51-4c2f-95f1-6416de335031-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-gf878\" (UID: \"845d2afa-de51-4c2f-95f1-6416de335031\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878"
Apr 16 04:26:19.866722 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.866632 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/845d2afa-de51-4c2f-95f1-6416de335031-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-gf878\" (UID: \"845d2afa-de51-4c2f-95f1-6416de335031\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878"
Apr 16 04:26:19.866722 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.866664 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2c02d360-9b4f-44d2-802b-44aa2d3ca611-sys\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.866722 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.866693 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/845d2afa-de51-4c2f-95f1-6416de335031-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-gf878\" (UID: \"845d2afa-de51-4c2f-95f1-6416de335031\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878"
Apr 16 04:26:19.866919 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.866738 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2c02d360-9b4f-44d2-802b-44aa2d3ca611-node-exporter-textfile\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.866919 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.866763 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2c02d360-9b4f-44d2-802b-44aa2d3ca611-metrics-client-ca\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.866919 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.866787 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zwx9\" (UniqueName: \"kubernetes.io/projected/2c02d360-9b4f-44d2-802b-44aa2d3ca611-kube-api-access-9zwx9\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.968150 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.968110 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfsf4\" (UniqueName: \"kubernetes.io/projected/845d2afa-de51-4c2f-95f1-6416de335031-kube-api-access-wfsf4\") pod \"kube-state-metrics-7479c89684-gf878\" (UID: \"845d2afa-de51-4c2f-95f1-6416de335031\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878"
Apr 16 04:26:19.968150 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.968151 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2c02d360-9b4f-44d2-802b-44aa2d3ca611-node-exporter-tls\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.968396 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.968182 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2c02d360-9b4f-44d2-802b-44aa2d3ca611-node-exporter-wtmp\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.968396 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.968199 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/845d2afa-de51-4c2f-95f1-6416de335031-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-gf878\" (UID: \"845d2afa-de51-4c2f-95f1-6416de335031\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878"
Apr 16 04:26:19.968396 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.968223 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2c02d360-9b4f-44d2-802b-44aa2d3ca611-node-exporter-accelerators-collector-config\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.968396 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:26:19.968285 2567 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 04:26:19.968396 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.968307 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2c02d360-9b4f-44d2-802b-44aa2d3ca611-root\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.968396 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.968372 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2c02d360-9b4f-44d2-802b-44aa2d3ca611-node-exporter-wtmp\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.968660 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.968396 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2c02d360-9b4f-44d2-802b-44aa2d3ca611-root\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.968660 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:26:19.968380 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c02d360-9b4f-44d2-802b-44aa2d3ca611-node-exporter-tls podName:2c02d360-9b4f-44d2-802b-44aa2d3ca611 nodeName:}" failed. No retries permitted until 2026-04-16 04:26:20.468347521 +0000 UTC m=+136.453797570 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/2c02d360-9b4f-44d2-802b-44aa2d3ca611-node-exporter-tls") pod "node-exporter-rgpmw" (UID: "2c02d360-9b4f-44d2-802b-44aa2d3ca611") : secret "node-exporter-tls" not found
Apr 16 04:26:19.968660 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.968493 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2c02d360-9b4f-44d2-802b-44aa2d3ca611-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.968660 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.968526 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/845d2afa-de51-4c2f-95f1-6416de335031-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-gf878\" (UID: \"845d2afa-de51-4c2f-95f1-6416de335031\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878"
Apr 16 04:26:19.968660 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.968570 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/845d2afa-de51-4c2f-95f1-6416de335031-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-gf878\" (UID: \"845d2afa-de51-4c2f-95f1-6416de335031\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878"
Apr 16 04:26:19.968660 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.968611 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2c02d360-9b4f-44d2-802b-44aa2d3ca611-sys\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.968660 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.968651 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/845d2afa-de51-4c2f-95f1-6416de335031-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-gf878\" (UID: \"845d2afa-de51-4c2f-95f1-6416de335031\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878"
Apr 16 04:26:19.969033 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:26:19.968686 2567 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 16 04:26:19.969033 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.968695 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2c02d360-9b4f-44d2-802b-44aa2d3ca611-node-exporter-textfile\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.969033 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.968722 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2c02d360-9b4f-44d2-802b-44aa2d3ca611-metrics-client-ca\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.969033 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.968745 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zwx9\" (UniqueName: \"kubernetes.io/projected/2c02d360-9b4f-44d2-802b-44aa2d3ca611-kube-api-access-9zwx9\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.969033 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:26:19.968756 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/845d2afa-de51-4c2f-95f1-6416de335031-kube-state-metrics-tls podName:845d2afa-de51-4c2f-95f1-6416de335031 nodeName:}" failed. No retries permitted until 2026-04-16 04:26:20.468739572 +0000 UTC m=+136.454189614 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/845d2afa-de51-4c2f-95f1-6416de335031-kube-state-metrics-tls") pod "kube-state-metrics-7479c89684-gf878" (UID: "845d2afa-de51-4c2f-95f1-6416de335031") : secret "kube-state-metrics-tls" not found
Apr 16 04:26:19.969033 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.968904 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2c02d360-9b4f-44d2-802b-44aa2d3ca611-node-exporter-accelerators-collector-config\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.969033 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.968970 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2c02d360-9b4f-44d2-802b-44aa2d3ca611-sys\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.969033 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.969002 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/845d2afa-de51-4c2f-95f1-6416de335031-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-gf878\" (UID: \"845d2afa-de51-4c2f-95f1-6416de335031\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878"
Apr 16 04:26:19.969443 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.969091 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/845d2afa-de51-4c2f-95f1-6416de335031-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-gf878\" (UID: \"845d2afa-de51-4c2f-95f1-6416de335031\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878"
Apr 16 04:26:19.969443 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.969373 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/845d2afa-de51-4c2f-95f1-6416de335031-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-gf878\" (UID: \"845d2afa-de51-4c2f-95f1-6416de335031\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878"
Apr 16 04:26:19.969443 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.969396 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/845d2afa-de51-4c2f-95f1-6416de335031-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-gf878\" (UID: \"845d2afa-de51-4c2f-95f1-6416de335031\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878"
Apr 16 04:26:19.969443 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.969418 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2c02d360-9b4f-44d2-802b-44aa2d3ca611-metrics-client-ca\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.969443 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.969432 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2c02d360-9b4f-44d2-802b-44aa2d3ca611-node-exporter-textfile\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.971159 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.971137 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2c02d360-9b4f-44d2-802b-44aa2d3ca611-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:19.971216 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.971172 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/845d2afa-de51-4c2f-95f1-6416de335031-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-gf878\" (UID: \"845d2afa-de51-4c2f-95f1-6416de335031\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878"
Apr 16 04:26:19.977760 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.977734 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfsf4\" (UniqueName: \"kubernetes.io/projected/845d2afa-de51-4c2f-95f1-6416de335031-kube-api-access-wfsf4\") pod \"kube-state-metrics-7479c89684-gf878\" (UID: \"845d2afa-de51-4c2f-95f1-6416de335031\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878"
Apr 16 04:26:19.977981 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:19.977932 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zwx9\" (UniqueName: \"kubernetes.io/projected/2c02d360-9b4f-44d2-802b-44aa2d3ca611-kube-api-access-9zwx9\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:20.474918 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:20.474878 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2c02d360-9b4f-44d2-802b-44aa2d3ca611-node-exporter-tls\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:20.475089 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:20.474971 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/845d2afa-de51-4c2f-95f1-6416de335031-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-gf878\" (UID: \"845d2afa-de51-4c2f-95f1-6416de335031\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878"
Apr 16 04:26:20.477560 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:20.477528 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/845d2afa-de51-4c2f-95f1-6416de335031-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-gf878\" (UID: \"845d2afa-de51-4c2f-95f1-6416de335031\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878"
Apr 16 04:26:20.477708 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:20.477654 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2c02d360-9b4f-44d2-802b-44aa2d3ca611-node-exporter-tls\") pod \"node-exporter-rgpmw\" (UID: \"2c02d360-9b4f-44d2-802b-44aa2d3ca611\") " pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:20.668971 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:20.668935 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878"
Apr 16 04:26:20.677255 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:20.677219 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-rgpmw"
Apr 16 04:26:20.687410 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:26:20.687378 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c02d360_9b4f_44d2_802b_44aa2d3ca611.slice/crio-35f4ce73a1925d29467d8785f6d7138791627ec944a85185773abd8076b433c9 WatchSource:0}: Error finding container 35f4ce73a1925d29467d8785f6d7138791627ec944a85185773abd8076b433c9: Status 404 returned error can't find the container with id 35f4ce73a1925d29467d8785f6d7138791627ec944a85185773abd8076b433c9
Apr 16 04:26:20.814976 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:20.814939 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-gf878"]
Apr 16 04:26:20.818642 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:26:20.818606 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod845d2afa_de51_4c2f_95f1_6416de335031.slice/crio-ddb198c7a3768c554495d128577368b026af6d8f956aef690fa7770a8b3413de WatchSource:0}: Error finding container ddb198c7a3768c554495d128577368b026af6d8f956aef690fa7770a8b3413de: Status 404 returned error can't find the container with id ddb198c7a3768c554495d128577368b026af6d8f956aef690fa7770a8b3413de
Apr 16 04:26:21.111530 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:21.111492 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rgpmw" event={"ID":"2c02d360-9b4f-44d2-802b-44aa2d3ca611","Type":"ContainerStarted","Data":"35f4ce73a1925d29467d8785f6d7138791627ec944a85185773abd8076b433c9"}
Apr 16 04:26:21.112731 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:21.112703 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878" event={"ID":"845d2afa-de51-4c2f-95f1-6416de335031","Type":"ContainerStarted","Data":"ddb198c7a3768c554495d128577368b026af6d8f956aef690fa7770a8b3413de"}
Apr 16 04:26:22.117442 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:22.117408 2567 generic.go:358] "Generic (PLEG): container finished" podID="2c02d360-9b4f-44d2-802b-44aa2d3ca611" containerID="27973ead13b4b984ca478a15cd67c6d1c03ead50e0115dba1aed3f51cc325378" exitCode=0
Apr 16 04:26:22.117881 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:22.117504 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rgpmw" event={"ID":"2c02d360-9b4f-44d2-802b-44aa2d3ca611","Type":"ContainerDied","Data":"27973ead13b4b984ca478a15cd67c6d1c03ead50e0115dba1aed3f51cc325378"}
Apr 16 04:26:23.122449 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:23.122413 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rgpmw" event={"ID":"2c02d360-9b4f-44d2-802b-44aa2d3ca611","Type":"ContainerStarted","Data":"de61dabfc258aa26b6a2ec5f598876497716c457ce3be50411eb64fbdfb7a66f"}
Apr 16 04:26:23.122449 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:23.122455 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rgpmw" event={"ID":"2c02d360-9b4f-44d2-802b-44aa2d3ca611","Type":"ContainerStarted","Data":"8799e99fcc64fc3da70e111df2ddb9daebe6b5916c228c6977c195a8da789b4c"}
Apr 16 04:26:23.124185 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:23.124162 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878" event={"ID":"845d2afa-de51-4c2f-95f1-6416de335031","Type":"ContainerStarted","Data":"7dfe6605b8c7c6a0835bf35d30199c5cd8e579c5ff328529c1542c89c4038ee0"}
Apr 16 04:26:23.124277 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:23.124189 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878" event={"ID":"845d2afa-de51-4c2f-95f1-6416de335031","Type":"ContainerStarted","Data":"21f44425cbd8696cb3c55f3e1ef8e0a2091288c1b30235da4d36f5b36d3d059d"}
Apr 16 04:26:23.124277 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:23.124200 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878" event={"ID":"845d2afa-de51-4c2f-95f1-6416de335031","Type":"ContainerStarted","Data":"a0feb290567c4ce4f4980092bcfecef9a0f1468105cbda56ccc04e70249a5b34"}
Apr 16 04:26:23.144265 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:23.144215 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-rgpmw" podStartSLOduration=3.3892404 podStartE2EDuration="4.144200232s" podCreationTimestamp="2026-04-16 04:26:19 +0000 UTC" firstStartedPulling="2026-04-16 04:26:20.68945865 +0000 UTC m=+136.674908693" lastFinishedPulling="2026-04-16 04:26:21.444418465 +0000 UTC m=+137.429868525" observedRunningTime="2026-04-16 04:26:23.142984221 +0000 UTC m=+139.128434299" watchObservedRunningTime="2026-04-16 04:26:23.144200232 +0000 UTC m=+139.129650296"
Apr 16 04:26:23.295606 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:23.295550 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-gf878" podStartSLOduration=2.903389029 podStartE2EDuration="4.295534219s" podCreationTimestamp="2026-04-16 04:26:19 +0000 UTC" firstStartedPulling="2026-04-16 04:26:20.82093644 +0000 UTC m=+136.806386482" lastFinishedPulling="2026-04-16 04:26:22.21308163 +0000 UTC m=+138.198531672" observedRunningTime="2026-04-16 04:26:23.164230081 +0000 UTC m=+139.149680146" watchObservedRunningTime="2026-04-16 04:26:23.295534219 +0000 UTC m=+139.280984279"
Apr 16 04:26:23.296670 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:23.296649 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-589df4bd79-gpnbv"]
Apr 16 04:26:23.296912 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:26:23.296893 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-589df4bd79-gpnbv" podUID="cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd"
Apr 16 04:26:24.127217 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.127179 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:26:24.131579 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.131554 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-589df4bd79-gpnbv"
Apr 16 04:26:24.208648 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.208618 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-ca-trust-extracted\") pod \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") "
Apr 16 04:26:24.208820 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.208661 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-bound-sa-token\") pod \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") "
Apr 16 04:26:24.208820 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.208695 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-certificates\") pod \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") "
Apr 16 04:26:24.208820 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.208725 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-trusted-ca\") pod \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") "
Apr 16 04:26:24.208820 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.208766 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-installation-pull-secrets\") pod \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") "
Apr 16 04:26:24.209053 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.208895 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd" (UID: "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 04:26:24.209053 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.208906 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx76m\" (UniqueName: \"kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-kube-api-access-sx76m\") pod \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") "
Apr 16 04:26:24.209053 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.208975 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-image-registry-private-configuration\") pod \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\" (UID: \"cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd\") "
Apr 16 04:26:24.209214 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.209066 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd" (UID: "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 04:26:24.209214 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.209159 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd" (UID: "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 04:26:24.209498 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.209459 2567 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-ca-trust-extracted\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\""
Apr 16 04:26:24.209498 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.209488 2567 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-certificates\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\""
Apr 16 04:26:24.209498 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.209504 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-trusted-ca\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\""
Apr 16 04:26:24.211523 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.211496 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd" (UID: "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 04:26:24.211631 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.211591 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd" (UID: "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd"). InnerVolumeSpecName "installation-pull-secrets".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 04:26:24.211721 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.211698 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-kube-api-access-sx76m" (OuterVolumeSpecName: "kube-api-access-sx76m") pod "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd" (UID: "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd"). InnerVolumeSpecName "kube-api-access-sx76m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 04:26:24.212144 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.212128 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd" (UID: "cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 04:26:24.310149 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.310113 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sx76m\" (UniqueName: \"kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-kube-api-access-sx76m\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:26:24.310149 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.310142 2567 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-image-registry-private-configuration\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:26:24.310149 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.310154 2567 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-bound-sa-token\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 
04:26:24.310371 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.310165 2567 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-installation-pull-secrets\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:26:24.532688 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.532654 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-458wb"] Apr 16 04:26:24.538566 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.538548 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-458wb" Apr 16 04:26:24.541069 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.541046 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 04:26:24.541186 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.541155 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-fj9hm\"" Apr 16 04:26:24.548505 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.548471 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-458wb"] Apr 16 04:26:24.612792 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.612761 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/674ddf7c-c87b-4a2d-905f-ce7f5730ae71-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-458wb\" (UID: \"674ddf7c-c87b-4a2d-905f-ce7f5730ae71\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-458wb" Apr 16 04:26:24.713543 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.713512 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/674ddf7c-c87b-4a2d-905f-ce7f5730ae71-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-458wb\" (UID: \"674ddf7c-c87b-4a2d-905f-ce7f5730ae71\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-458wb" Apr 16 04:26:24.715894 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.715866 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/674ddf7c-c87b-4a2d-905f-ce7f5730ae71-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-458wb\" (UID: \"674ddf7c-c87b-4a2d-905f-ce7f5730ae71\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-458wb" Apr 16 04:26:24.848008 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.847919 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-458wb" Apr 16 04:26:24.946781 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.946745 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5d76444989-mlm9q"] Apr 16 04:26:24.953134 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.953113 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:24.956742 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.956518 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-hv57c\"" Apr 16 04:26:24.956742 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.956532 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 04:26:24.956742 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.956587 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 04:26:24.957002 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.956786 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5d76444989-mlm9q"] Apr 16 04:26:24.957002 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.956927 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 04:26:24.957002 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.956960 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 04:26:24.957002 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.956963 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 04:26:24.960813 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.960789 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 04:26:24.971252 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:24.971209 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-458wb"] Apr 16 04:26:24.977151 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:26:24.977119 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod674ddf7c_c87b_4a2d_905f_ce7f5730ae71.slice/crio-9ebcedaedc2cee53f1bf7b930e9b2d5346758c40b7bc6e74d0842b3e0cfc746e WatchSource:0}: Error finding container 9ebcedaedc2cee53f1bf7b930e9b2d5346758c40b7bc6e74d0842b3e0cfc746e: Status 404 returned error can't find the container with id 9ebcedaedc2cee53f1bf7b930e9b2d5346758c40b7bc6e74d0842b3e0cfc746e Apr 16 04:26:25.016024 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.015990 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzx2p\" (UniqueName: \"kubernetes.io/projected/c906f038-e36e-466f-8ce1-62037b784089-kube-api-access-pzx2p\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.016177 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.016043 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c906f038-e36e-466f-8ce1-62037b784089-telemeter-client-tls\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.016177 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.016095 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c906f038-e36e-466f-8ce1-62037b784089-secret-telemeter-client\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " 
pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.016177 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.016132 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c906f038-e36e-466f-8ce1-62037b784089-metrics-client-ca\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.016272 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.016190 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c906f038-e36e-466f-8ce1-62037b784089-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.016272 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.016252 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c906f038-e36e-466f-8ce1-62037b784089-federate-client-tls\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.016339 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.016286 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c906f038-e36e-466f-8ce1-62037b784089-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.016339 ip-10-0-133-81 
kubenswrapper[2567]: I0416 04:26:25.016306 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c906f038-e36e-466f-8ce1-62037b784089-serving-certs-ca-bundle\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.116749 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.116660 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c906f038-e36e-466f-8ce1-62037b784089-serving-certs-ca-bundle\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.116749 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.116699 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzx2p\" (UniqueName: \"kubernetes.io/projected/c906f038-e36e-466f-8ce1-62037b784089-kube-api-access-pzx2p\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.117010 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.116822 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c906f038-e36e-466f-8ce1-62037b784089-telemeter-client-tls\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.117010 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.116910 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: 
\"kubernetes.io/secret/c906f038-e36e-466f-8ce1-62037b784089-secret-telemeter-client\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.117010 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.116944 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c906f038-e36e-466f-8ce1-62037b784089-metrics-client-ca\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.117224 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.117024 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c906f038-e36e-466f-8ce1-62037b784089-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.117224 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.117083 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c906f038-e36e-466f-8ce1-62037b784089-federate-client-tls\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.117224 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.117117 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c906f038-e36e-466f-8ce1-62037b784089-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " 
pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.117619 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.117584 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c906f038-e36e-466f-8ce1-62037b784089-serving-certs-ca-bundle\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.117736 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.117665 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c906f038-e36e-466f-8ce1-62037b784089-metrics-client-ca\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.118742 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.118713 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c906f038-e36e-466f-8ce1-62037b784089-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.119371 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.119351 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c906f038-e36e-466f-8ce1-62037b784089-telemeter-client-tls\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.119468 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.119395 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/c906f038-e36e-466f-8ce1-62037b784089-federate-client-tls\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.119532 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.119493 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c906f038-e36e-466f-8ce1-62037b784089-secret-telemeter-client\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.119991 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.119959 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c906f038-e36e-466f-8ce1-62037b784089-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.123769 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.123749 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzx2p\" (UniqueName: \"kubernetes.io/projected/c906f038-e36e-466f-8ce1-62037b784089-kube-api-access-pzx2p\") pod \"telemeter-client-5d76444989-mlm9q\" (UID: \"c906f038-e36e-466f-8ce1-62037b784089\") " pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.130583 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.130561 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-458wb" event={"ID":"674ddf7c-c87b-4a2d-905f-ce7f5730ae71","Type":"ContainerStarted","Data":"9ebcedaedc2cee53f1bf7b930e9b2d5346758c40b7bc6e74d0842b3e0cfc746e"} Apr 16 04:26:25.130888 ip-10-0-133-81 kubenswrapper[2567]: 
I0416 04:26:25.130568 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-589df4bd79-gpnbv" Apr 16 04:26:25.160374 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.160332 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-589df4bd79-gpnbv"] Apr 16 04:26:25.165466 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.165440 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-589df4bd79-gpnbv"] Apr 16 04:26:25.264686 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.264643 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" Apr 16 04:26:25.320189 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.320155 2567 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd-registry-tls\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:26:25.411727 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:25.408058 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5d76444989-mlm9q"] Apr 16 04:26:25.414593 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:26:25.414565 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc906f038_e36e_466f_8ce1_62037b784089.slice/crio-7d582106cfec0904a3d5c9ba7d9e2efec7056c718a7b6918b7feb261e6d31f67 WatchSource:0}: Error finding container 7d582106cfec0904a3d5c9ba7d9e2efec7056c718a7b6918b7feb261e6d31f67: Status 404 returned error can't find the container with id 7d582106cfec0904a3d5c9ba7d9e2efec7056c718a7b6918b7feb261e6d31f67 Apr 16 04:26:26.015986 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.015909 2567 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 04:26:26.020897 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.020865 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.028535 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.028012 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5gt7vnkakqk52\"" Apr 16 04:26:26.028535 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.028230 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-8xqx8\"" Apr 16 04:26:26.028535 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.028481 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 04:26:26.029855 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.028703 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 04:26:26.029855 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.028749 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 04:26:26.029855 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.029038 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 04:26:26.029855 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.029133 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 04:26:26.029855 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.029231 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 04:26:26.029855 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.029372 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 04:26:26.029855 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.029467 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 04:26:26.029855 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.029589 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 04:26:26.031676 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.031064 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 04:26:26.031676 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.031122 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 04:26:26.031676 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.031385 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 04:26:26.031942 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.031840 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 04:26:26.033445 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.033400 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 04:26:26.127145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.126457 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.127145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.126519 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/49150035-8fd6-4d67-86db-5acab25e44fd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.127145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.126555 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/49150035-8fd6-4d67-86db-5acab25e44fd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.127145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.126585 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.127145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.126615 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.127145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.126647 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.127145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.126674 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.127145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.126731 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.127145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.126759 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.127145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.126784 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.127145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.126813 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.127145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.126854 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.127145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.126882 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/49150035-8fd6-4d67-86db-5acab25e44fd-config-out\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.127145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.126921 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-config\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.127145 ip-10-0-133-81 
kubenswrapper[2567]: I0416 04:26:26.126945 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-web-config\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.127145 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.126975 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.127975 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.126993 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.127975 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.127225 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcwqm\" (UniqueName: \"kubernetes.io/projected/49150035-8fd6-4d67-86db-5acab25e44fd-kube-api-access-fcwqm\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.137211 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.137069 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" 
event={"ID":"c906f038-e36e-466f-8ce1-62037b784089","Type":"ContainerStarted","Data":"7d582106cfec0904a3d5c9ba7d9e2efec7056c718a7b6918b7feb261e6d31f67"} Apr 16 04:26:26.228644 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.227718 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcwqm\" (UniqueName: \"kubernetes.io/projected/49150035-8fd6-4d67-86db-5acab25e44fd-kube-api-access-fcwqm\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.228644 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.227786 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.228644 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.227846 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/49150035-8fd6-4d67-86db-5acab25e44fd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.228644 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.227915 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/49150035-8fd6-4d67-86db-5acab25e44fd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.228644 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.227943 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.228644 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.227969 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.228644 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.227999 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.228644 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.228022 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.228644 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.228069 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.228644 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.228097 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.228644 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.228127 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.228644 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.228155 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.228644 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.228181 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.228644 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.228218 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/49150035-8fd6-4d67-86db-5acab25e44fd-config-out\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.228644 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.228254 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-config\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.228644 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.228283 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-web-config\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.228644 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.228312 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.229667 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.228336 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.229667 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.229520 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/49150035-8fd6-4d67-86db-5acab25e44fd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.232648 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.232040 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.234325 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.233095 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.234325 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.233667 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.234325 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.234279 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.236944 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.236107 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.237977 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.237927 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.241351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.239514 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.241351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.239581 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-web-config\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.241351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.240198 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.241351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.240315 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.241351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.240681 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.241351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.240686 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.241351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.240920 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/49150035-8fd6-4d67-86db-5acab25e44fd-config-out\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.241351 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.241140 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.242162 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.241577 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/49150035-8fd6-4d67-86db-5acab25e44fd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.242647 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.242624 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-config\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.242729 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.242689 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcwqm\" (UniqueName: \"kubernetes.io/projected/49150035-8fd6-4d67-86db-5acab25e44fd-kube-api-access-fcwqm\") pod \"prometheus-k8s-0\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.346633 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.346529 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:26.641183 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.641143 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 04:26:26.644380 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:26:26.644352 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49150035_8fd6_4d67_86db_5acab25e44fd.slice/crio-9b5da6228428038ed99525e352e611d164d67544aadbfe1f7dd7e91ccb05102e WatchSource:0}: Error finding container 9b5da6228428038ed99525e352e611d164d67544aadbfe1f7dd7e91ccb05102e: Status 404 returned error can't find the container with id 9b5da6228428038ed99525e352e611d164d67544aadbfe1f7dd7e91ccb05102e Apr 16 04:26:26.674083 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:26.674051 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd" path="/var/lib/kubelet/pods/cf4edcc6-ad3a-42fb-abb0-9c0ddf5b11fd/volumes" Apr 16 04:26:27.147578 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:27.147540 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-458wb" event={"ID":"674ddf7c-c87b-4a2d-905f-ce7f5730ae71","Type":"ContainerStarted","Data":"bcdc00fad29bd7ba2cc06bd7c8c52f993afd324abcd45ca8554275c252e9ddfb"} Apr 16 04:26:27.148066 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:27.147935 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-458wb" Apr 16 04:26:27.149133 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:27.149106 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"49150035-8fd6-4d67-86db-5acab25e44fd","Type":"ContainerStarted","Data":"9b5da6228428038ed99525e352e611d164d67544aadbfe1f7dd7e91ccb05102e"} Apr 16 
04:26:27.153984 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:27.153962 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-458wb" Apr 16 04:26:27.163233 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:27.163177 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-458wb" podStartSLOduration=1.605876485 podStartE2EDuration="3.163160736s" podCreationTimestamp="2026-04-16 04:26:24 +0000 UTC" firstStartedPulling="2026-04-16 04:26:24.979075725 +0000 UTC m=+140.964525766" lastFinishedPulling="2026-04-16 04:26:26.53635997 +0000 UTC m=+142.521810017" observedRunningTime="2026-04-16 04:26:27.161713444 +0000 UTC m=+143.147163513" watchObservedRunningTime="2026-04-16 04:26:27.163160736 +0000 UTC m=+143.148610800" Apr 16 04:26:28.154248 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:28.154158 2567 generic.go:358] "Generic (PLEG): container finished" podID="49150035-8fd6-4d67-86db-5acab25e44fd" containerID="3ec0478eba7176deb36b2e16181b697279ce0df6147e0ed35c31b2fbc1746f4a" exitCode=0 Apr 16 04:26:28.154702 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:28.154240 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"49150035-8fd6-4d67-86db-5acab25e44fd","Type":"ContainerDied","Data":"3ec0478eba7176deb36b2e16181b697279ce0df6147e0ed35c31b2fbc1746f4a"} Apr 16 04:26:28.156248 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:28.156224 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" event={"ID":"c906f038-e36e-466f-8ce1-62037b784089","Type":"ContainerStarted","Data":"14c371ef7e8f9c181dcb66244abc204a81b9ae6e0e529667bb5fc2d3be2c20ee"} Apr 16 04:26:28.156334 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:28.156256 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" event={"ID":"c906f038-e36e-466f-8ce1-62037b784089","Type":"ContainerStarted","Data":"47689f47350dc9c342c5d87c97830820b3f9efb93b732a703ff0ab6f607b07d0"} Apr 16 04:26:28.156334 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:28.156266 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" event={"ID":"c906f038-e36e-466f-8ce1-62037b784089","Type":"ContainerStarted","Data":"fb5e07a754d4a871e24cdc606f31effdc3bfa416ab8e0932cc7ff79dba531116"} Apr 16 04:26:28.197221 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:28.197154 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5d76444989-mlm9q" podStartSLOduration=1.8574637680000001 podStartE2EDuration="4.197132158s" podCreationTimestamp="2026-04-16 04:26:24 +0000 UTC" firstStartedPulling="2026-04-16 04:26:25.416427257 +0000 UTC m=+141.401877299" lastFinishedPulling="2026-04-16 04:26:27.756095634 +0000 UTC m=+143.741545689" observedRunningTime="2026-04-16 04:26:28.195754398 +0000 UTC m=+144.181204461" watchObservedRunningTime="2026-04-16 04:26:28.197132158 +0000 UTC m=+144.182582223" Apr 16 04:26:32.173887 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:32.173849 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"49150035-8fd6-4d67-86db-5acab25e44fd","Type":"ContainerStarted","Data":"5e7a51f525a719a5696adfc79018a0b2b1769611c56ce7586e2c26757f4df3b1"} Apr 16 04:26:32.173887 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:32.173890 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"49150035-8fd6-4d67-86db-5acab25e44fd","Type":"ContainerStarted","Data":"874585e2b570724bde5a5a90591d9d0b24f91ba4bf36658864760c74c58c0d07"} Apr 16 04:26:34.183390 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:34.183350 2567 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"49150035-8fd6-4d67-86db-5acab25e44fd","Type":"ContainerStarted","Data":"314b35c23916e466568d57753891c9bd9497b4c3c2cc7bbddd733e67789802b6"} Apr 16 04:26:34.183390 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:34.183389 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"49150035-8fd6-4d67-86db-5acab25e44fd","Type":"ContainerStarted","Data":"15b47f7232bd0abc80aeb6dcd4c02116eeeb6d77c8b548b27f7f7a99d5073d4a"} Apr 16 04:26:34.183390 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:34.183398 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"49150035-8fd6-4d67-86db-5acab25e44fd","Type":"ContainerStarted","Data":"5d010beaaf827fb0f0b6b3dcf36a45e71372d50d3a053fb6aea05e55930a79cc"} Apr 16 04:26:34.184038 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:34.183407 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"49150035-8fd6-4d67-86db-5acab25e44fd","Type":"ContainerStarted","Data":"f7da96b0897eeae8d30f813498ad151d0fc8cb40101ce8c7dbac8b15317dd57f"} Apr 16 04:26:34.207649 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:34.207578 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.692055518 podStartE2EDuration="9.207557821s" podCreationTimestamp="2026-04-16 04:26:25 +0000 UTC" firstStartedPulling="2026-04-16 04:26:26.646125596 +0000 UTC m=+142.631575653" lastFinishedPulling="2026-04-16 04:26:33.1616279 +0000 UTC m=+149.147077956" observedRunningTime="2026-04-16 04:26:34.206256707 +0000 UTC m=+150.191706770" watchObservedRunningTime="2026-04-16 04:26:34.207557821 +0000 UTC m=+150.193007886" Apr 16 04:26:36.347433 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:36.347391 2567 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:26:40.395097 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:26:40.395035 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-6jqxs" podUID="c19b5e4b-3455-4e1c-b332-ba6c51fb153b" Apr 16 04:26:40.409236 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:26:40.409207 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-m5jm4" podUID="390e05c5-2dbf-454b-872e-6a8969a124ae" Apr 16 04:26:41.204876 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:41.204781 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6jqxs" Apr 16 04:26:45.411169 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:45.411122 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls\") pod \"dns-default-m5jm4\" (UID: \"390e05c5-2dbf-454b-872e-6a8969a124ae\") " pod="openshift-dns/dns-default-m5jm4" Apr 16 04:26:45.411558 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:45.411288 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert\") pod \"ingress-canary-6jqxs\" (UID: \"c19b5e4b-3455-4e1c-b332-ba6c51fb153b\") " pod="openshift-ingress-canary/ingress-canary-6jqxs" Apr 16 04:26:45.413488 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:45.413464 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/390e05c5-2dbf-454b-872e-6a8969a124ae-metrics-tls\") pod 
\"dns-default-m5jm4\" (UID: \"390e05c5-2dbf-454b-872e-6a8969a124ae\") " pod="openshift-dns/dns-default-m5jm4" Apr 16 04:26:45.413605 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:45.413540 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19b5e4b-3455-4e1c-b332-ba6c51fb153b-cert\") pod \"ingress-canary-6jqxs\" (UID: \"c19b5e4b-3455-4e1c-b332-ba6c51fb153b\") " pod="openshift-ingress-canary/ingress-canary-6jqxs" Apr 16 04:26:45.708303 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:45.708209 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-d4mbv\"" Apr 16 04:26:45.716574 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:45.716543 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6jqxs" Apr 16 04:26:45.837689 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:45.837664 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6jqxs"] Apr 16 04:26:45.840389 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:26:45.840359 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc19b5e4b_3455_4e1c_b332_ba6c51fb153b.slice/crio-d6b2bd6a83aec58cf2bf120d587c4ab920f007b3fbea4c2ef6eb5effa633d893 WatchSource:0}: Error finding container d6b2bd6a83aec58cf2bf120d587c4ab920f007b3fbea4c2ef6eb5effa633d893: Status 404 returned error can't find the container with id d6b2bd6a83aec58cf2bf120d587c4ab920f007b3fbea4c2ef6eb5effa633d893 Apr 16 04:26:46.224740 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:46.224702 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6jqxs" event={"ID":"c19b5e4b-3455-4e1c-b332-ba6c51fb153b","Type":"ContainerStarted","Data":"d6b2bd6a83aec58cf2bf120d587c4ab920f007b3fbea4c2ef6eb5effa633d893"} 
Apr 16 04:26:48.232795 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:48.232715 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6jqxs" event={"ID":"c19b5e4b-3455-4e1c-b332-ba6c51fb153b","Type":"ContainerStarted","Data":"78e37c27564b6d6fd2049c78e884c5754cba2578b2103c441a8c1f8eea91a483"} Apr 16 04:26:48.247998 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:48.247945 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6jqxs" podStartSLOduration=129.577096205 podStartE2EDuration="2m11.24792995s" podCreationTimestamp="2026-04-16 04:24:37 +0000 UTC" firstStartedPulling="2026-04-16 04:26:45.842326633 +0000 UTC m=+161.827776675" lastFinishedPulling="2026-04-16 04:26:47.513160366 +0000 UTC m=+163.498610420" observedRunningTime="2026-04-16 04:26:48.247039975 +0000 UTC m=+164.232490039" watchObservedRunningTime="2026-04-16 04:26:48.24792995 +0000 UTC m=+164.233380013" Apr 16 04:26:53.669687 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:53.669644 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-m5jm4" Apr 16 04:26:53.673264 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:53.673242 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-smz8d\"" Apr 16 04:26:53.680374 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:53.680343 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-m5jm4" Apr 16 04:26:53.807910 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:53.807885 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-m5jm4"] Apr 16 04:26:53.810392 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:26:53.810368 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod390e05c5_2dbf_454b_872e_6a8969a124ae.slice/crio-e6612204d6c093a0bf324e26c66b6b8472252d498a116a7d5d84321eafb6fc46 WatchSource:0}: Error finding container e6612204d6c093a0bf324e26c66b6b8472252d498a116a7d5d84321eafb6fc46: Status 404 returned error can't find the container with id e6612204d6c093a0bf324e26c66b6b8472252d498a116a7d5d84321eafb6fc46 Apr 16 04:26:54.250864 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:54.250811 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m5jm4" event={"ID":"390e05c5-2dbf-454b-872e-6a8969a124ae","Type":"ContainerStarted","Data":"e6612204d6c093a0bf324e26c66b6b8472252d498a116a7d5d84321eafb6fc46"} Apr 16 04:26:55.256421 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:55.256381 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m5jm4" event={"ID":"390e05c5-2dbf-454b-872e-6a8969a124ae","Type":"ContainerStarted","Data":"81e77b77d270e78352a70ccdcf2099964d73452a04944d4f29584b7f8b18fb2a"} Apr 16 04:26:56.260439 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:56.260399 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m5jm4" event={"ID":"390e05c5-2dbf-454b-872e-6a8969a124ae","Type":"ContainerStarted","Data":"01174fde8f228221a8db1abeb0cf139ab8dab3016f1666adb956a0f803232277"} Apr 16 04:26:56.260856 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:56.260516 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-m5jm4" Apr 16 04:26:56.279108 
ip-10-0-133-81 kubenswrapper[2567]: I0416 04:26:56.279046 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-m5jm4" podStartSLOduration=138.005599105 podStartE2EDuration="2m19.279027759s" podCreationTimestamp="2026-04-16 04:24:37 +0000 UTC" firstStartedPulling="2026-04-16 04:26:53.812296853 +0000 UTC m=+169.797746895" lastFinishedPulling="2026-04-16 04:26:55.085725493 +0000 UTC m=+171.071175549" observedRunningTime="2026-04-16 04:26:56.277616059 +0000 UTC m=+172.263066124" watchObservedRunningTime="2026-04-16 04:26:56.279027759 +0000 UTC m=+172.264477823" Apr 16 04:27:03.282019 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:03.281981 2567 generic.go:358] "Generic (PLEG): container finished" podID="3f7e1800-0249-49d9-8db4-2138bd8e9201" containerID="a075877246791dfd787faa93bbbdb21f9ea23834cd7892ab42678205eae03449" exitCode=0 Apr 16 04:27:03.282398 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:03.282054 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kshw6" event={"ID":"3f7e1800-0249-49d9-8db4-2138bd8e9201","Type":"ContainerDied","Data":"a075877246791dfd787faa93bbbdb21f9ea23834cd7892ab42678205eae03449"} Apr 16 04:27:03.282398 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:03.282376 2567 scope.go:117] "RemoveContainer" containerID="a075877246791dfd787faa93bbbdb21f9ea23834cd7892ab42678205eae03449" Apr 16 04:27:04.286878 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:04.286819 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kshw6" event={"ID":"3f7e1800-0249-49d9-8db4-2138bd8e9201","Type":"ContainerStarted","Data":"80e0fff9a6f073e0437ddde07b0e5d7076c182300bdfdbb76cac117a1f9e7bf8"} Apr 16 04:27:06.266171 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:06.266138 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-m5jm4" 
Apr 16 04:27:13.314981 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:13.314942 2567 generic.go:358] "Generic (PLEG): container finished" podID="65296c0d-211b-4c4b-8926-070aad0da721" containerID="a42cdfe8223e883d6f488e6d676feff6c89077bdf3f4341e86ac7f257863ef82" exitCode=0 Apr 16 04:27:13.315366 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:13.315011 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" event={"ID":"65296c0d-211b-4c4b-8926-070aad0da721","Type":"ContainerDied","Data":"a42cdfe8223e883d6f488e6d676feff6c89077bdf3f4341e86ac7f257863ef82"} Apr 16 04:27:13.315408 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:13.315387 2567 scope.go:117] "RemoveContainer" containerID="a42cdfe8223e883d6f488e6d676feff6c89077bdf3f4341e86ac7f257863ef82" Apr 16 04:27:14.320526 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:14.320490 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-zwp94" event={"ID":"65296c0d-211b-4c4b-8926-070aad0da721","Type":"ContainerStarted","Data":"31c6c0480514b84e070d2fceab49cbb2e890f096fcbbfe9ca95db6002763767c"} Apr 16 04:27:26.347030 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:26.346978 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:26.366768 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:26.366741 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:26.381478 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:26.381452 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:44.267270 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.267225 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 04:27:44.267973 ip-10-0-133-81 
kubenswrapper[2567]: I0416 04:27:44.267912 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="prometheus" containerID="cri-o://874585e2b570724bde5a5a90591d9d0b24f91ba4bf36658864760c74c58c0d07" gracePeriod=600 Apr 16 04:27:44.268107 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.268074 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="kube-rbac-proxy" containerID="cri-o://15b47f7232bd0abc80aeb6dcd4c02116eeeb6d77c8b548b27f7f7a99d5073d4a" gracePeriod=600 Apr 16 04:27:44.268167 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.268126 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="kube-rbac-proxy-web" containerID="cri-o://5d010beaaf827fb0f0b6b3dcf36a45e71372d50d3a053fb6aea05e55930a79cc" gracePeriod=600 Apr 16 04:27:44.268221 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.268192 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="thanos-sidecar" containerID="cri-o://f7da96b0897eeae8d30f813498ad151d0fc8cb40101ce8c7dbac8b15317dd57f" gracePeriod=600 Apr 16 04:27:44.268278 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.268217 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="kube-rbac-proxy-thanos" containerID="cri-o://314b35c23916e466568d57753891c9bd9497b4c3c2cc7bbddd733e67789802b6" gracePeriod=600 Apr 16 04:27:44.268278 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.268244 2567 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="config-reloader" containerID="cri-o://5e7a51f525a719a5696adfc79018a0b2b1769611c56ce7586e2c26757f4df3b1" gracePeriod=600 Apr 16 04:27:44.415220 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.415189 2567 generic.go:358] "Generic (PLEG): container finished" podID="49150035-8fd6-4d67-86db-5acab25e44fd" containerID="314b35c23916e466568d57753891c9bd9497b4c3c2cc7bbddd733e67789802b6" exitCode=0 Apr 16 04:27:44.415220 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.415213 2567 generic.go:358] "Generic (PLEG): container finished" podID="49150035-8fd6-4d67-86db-5acab25e44fd" containerID="15b47f7232bd0abc80aeb6dcd4c02116eeeb6d77c8b548b27f7f7a99d5073d4a" exitCode=0 Apr 16 04:27:44.415220 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.415220 2567 generic.go:358] "Generic (PLEG): container finished" podID="49150035-8fd6-4d67-86db-5acab25e44fd" containerID="5d010beaaf827fb0f0b6b3dcf36a45e71372d50d3a053fb6aea05e55930a79cc" exitCode=0 Apr 16 04:27:44.415220 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.415225 2567 generic.go:358] "Generic (PLEG): container finished" podID="49150035-8fd6-4d67-86db-5acab25e44fd" containerID="f7da96b0897eeae8d30f813498ad151d0fc8cb40101ce8c7dbac8b15317dd57f" exitCode=0 Apr 16 04:27:44.415220 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.415230 2567 generic.go:358] "Generic (PLEG): container finished" podID="49150035-8fd6-4d67-86db-5acab25e44fd" containerID="5e7a51f525a719a5696adfc79018a0b2b1769611c56ce7586e2c26757f4df3b1" exitCode=0 Apr 16 04:27:44.415220 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.415235 2567 generic.go:358] "Generic (PLEG): container finished" podID="49150035-8fd6-4d67-86db-5acab25e44fd" containerID="874585e2b570724bde5a5a90591d9d0b24f91ba4bf36658864760c74c58c0d07" exitCode=0 Apr 16 04:27:44.415561 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.415264 2567 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"49150035-8fd6-4d67-86db-5acab25e44fd","Type":"ContainerDied","Data":"314b35c23916e466568d57753891c9bd9497b4c3c2cc7bbddd733e67789802b6"} Apr 16 04:27:44.415561 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.415310 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"49150035-8fd6-4d67-86db-5acab25e44fd","Type":"ContainerDied","Data":"15b47f7232bd0abc80aeb6dcd4c02116eeeb6d77c8b548b27f7f7a99d5073d4a"} Apr 16 04:27:44.415561 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.415323 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"49150035-8fd6-4d67-86db-5acab25e44fd","Type":"ContainerDied","Data":"5d010beaaf827fb0f0b6b3dcf36a45e71372d50d3a053fb6aea05e55930a79cc"} Apr 16 04:27:44.415561 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.415334 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"49150035-8fd6-4d67-86db-5acab25e44fd","Type":"ContainerDied","Data":"f7da96b0897eeae8d30f813498ad151d0fc8cb40101ce8c7dbac8b15317dd57f"} Apr 16 04:27:44.415561 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.415348 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"49150035-8fd6-4d67-86db-5acab25e44fd","Type":"ContainerDied","Data":"5e7a51f525a719a5696adfc79018a0b2b1769611c56ce7586e2c26757f4df3b1"} Apr 16 04:27:44.415561 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.415360 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"49150035-8fd6-4d67-86db-5acab25e44fd","Type":"ContainerDied","Data":"874585e2b570724bde5a5a90591d9d0b24f91ba4bf36658864760c74c58c0d07"} Apr 16 04:27:44.516485 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.516459 2567 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:44.609043 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.608947 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-kube-rbac-proxy\") pod \"49150035-8fd6-4d67-86db-5acab25e44fd\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " Apr 16 04:27:44.609043 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.608991 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"49150035-8fd6-4d67-86db-5acab25e44fd\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " Apr 16 04:27:44.609043 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.609033 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-prometheus-k8s-tls\") pod \"49150035-8fd6-4d67-86db-5acab25e44fd\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " Apr 16 04:27:44.609319 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.609071 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-configmap-kubelet-serving-ca-bundle\") pod \"49150035-8fd6-4d67-86db-5acab25e44fd\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " Apr 16 04:27:44.609319 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.609102 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-thanos-prometheus-http-client-file\") pod \"49150035-8fd6-4d67-86db-5acab25e44fd\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " Apr 16 04:27:44.609319 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.609135 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-grpc-tls\") pod \"49150035-8fd6-4d67-86db-5acab25e44fd\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " Apr 16 04:27:44.609319 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.609165 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/49150035-8fd6-4d67-86db-5acab25e44fd-tls-assets\") pod \"49150035-8fd6-4d67-86db-5acab25e44fd\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " Apr 16 04:27:44.609535 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.609502 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "49150035-8fd6-4d67-86db-5acab25e44fd" (UID: "49150035-8fd6-4d67-86db-5acab25e44fd"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 04:27:44.609632 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.609191 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-prometheus-trusted-ca-bundle\") pod \"49150035-8fd6-4d67-86db-5acab25e44fd\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " Apr 16 04:27:44.609787 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.609771 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/49150035-8fd6-4d67-86db-5acab25e44fd-prometheus-k8s-db\") pod \"49150035-8fd6-4d67-86db-5acab25e44fd\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " Apr 16 04:27:44.609990 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.609961 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "49150035-8fd6-4d67-86db-5acab25e44fd" (UID: "49150035-8fd6-4d67-86db-5acab25e44fd"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 04:27:44.610088 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.609984 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"49150035-8fd6-4d67-86db-5acab25e44fd\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " Apr 16 04:27:44.610207 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.610192 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-configmap-metrics-client-ca\") pod \"49150035-8fd6-4d67-86db-5acab25e44fd\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " Apr 16 04:27:44.610329 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.610314 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-metrics-client-certs\") pod \"49150035-8fd6-4d67-86db-5acab25e44fd\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " Apr 16 04:27:44.610507 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.610491 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/49150035-8fd6-4d67-86db-5acab25e44fd-config-out\") pod \"49150035-8fd6-4d67-86db-5acab25e44fd\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " Apr 16 04:27:44.611045 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.611027 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-web-config\") pod \"49150035-8fd6-4d67-86db-5acab25e44fd\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " Apr 
16 04:27:44.611189 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.611174 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-prometheus-k8s-rulefiles-0\") pod \"49150035-8fd6-4d67-86db-5acab25e44fd\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " Apr 16 04:27:44.611304 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.611288 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-configmap-serving-certs-ca-bundle\") pod \"49150035-8fd6-4d67-86db-5acab25e44fd\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " Apr 16 04:27:44.611414 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.611398 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-config\") pod \"49150035-8fd6-4d67-86db-5acab25e44fd\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " Apr 16 04:27:44.611524 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.611501 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcwqm\" (UniqueName: \"kubernetes.io/projected/49150035-8fd6-4d67-86db-5acab25e44fd-kube-api-access-fcwqm\") pod \"49150035-8fd6-4d67-86db-5acab25e44fd\" (UID: \"49150035-8fd6-4d67-86db-5acab25e44fd\") " Apr 16 04:27:44.611803 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.611783 2567 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:27:44.611911 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.611810 2567 
reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-prometheus-trusted-ca-bundle\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:27:44.611911 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.610523 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "49150035-8fd6-4d67-86db-5acab25e44fd" (UID: "49150035-8fd6-4d67-86db-5acab25e44fd"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 04:27:44.611911 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.611207 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49150035-8fd6-4d67-86db-5acab25e44fd-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "49150035-8fd6-4d67-86db-5acab25e44fd" (UID: "49150035-8fd6-4d67-86db-5acab25e44fd"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 04:27:44.611911 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.611897 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "49150035-8fd6-4d67-86db-5acab25e44fd" (UID: "49150035-8fd6-4d67-86db-5acab25e44fd"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 04:27:44.612793 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.612753 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49150035-8fd6-4d67-86db-5acab25e44fd-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "49150035-8fd6-4d67-86db-5acab25e44fd" (UID: "49150035-8fd6-4d67-86db-5acab25e44fd"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 04:27:44.613095 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.613069 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "49150035-8fd6-4d67-86db-5acab25e44fd" (UID: "49150035-8fd6-4d67-86db-5acab25e44fd"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 04:27:44.613714 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.613183 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "49150035-8fd6-4d67-86db-5acab25e44fd" (UID: "49150035-8fd6-4d67-86db-5acab25e44fd"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 04:27:44.613714 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.613246 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "49150035-8fd6-4d67-86db-5acab25e44fd" (UID: "49150035-8fd6-4d67-86db-5acab25e44fd"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 04:27:44.613714 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.613325 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "49150035-8fd6-4d67-86db-5acab25e44fd" (UID: "49150035-8fd6-4d67-86db-5acab25e44fd"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 04:27:44.613714 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.613356 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "49150035-8fd6-4d67-86db-5acab25e44fd" (UID: "49150035-8fd6-4d67-86db-5acab25e44fd"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 04:27:44.614021 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.613750 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "49150035-8fd6-4d67-86db-5acab25e44fd" (UID: "49150035-8fd6-4d67-86db-5acab25e44fd"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 04:27:44.614188 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.614155 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49150035-8fd6-4d67-86db-5acab25e44fd-kube-api-access-fcwqm" (OuterVolumeSpecName: "kube-api-access-fcwqm") pod "49150035-8fd6-4d67-86db-5acab25e44fd" (UID: "49150035-8fd6-4d67-86db-5acab25e44fd"). InnerVolumeSpecName "kube-api-access-fcwqm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 04:27:44.614439 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.614405 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-config" (OuterVolumeSpecName: "config") pod "49150035-8fd6-4d67-86db-5acab25e44fd" (UID: "49150035-8fd6-4d67-86db-5acab25e44fd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 04:27:44.614750 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.614726 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49150035-8fd6-4d67-86db-5acab25e44fd-config-out" (OuterVolumeSpecName: "config-out") pod "49150035-8fd6-4d67-86db-5acab25e44fd" (UID: "49150035-8fd6-4d67-86db-5acab25e44fd"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 04:27:44.614814 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.614755 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "49150035-8fd6-4d67-86db-5acab25e44fd" (UID: "49150035-8fd6-4d67-86db-5acab25e44fd"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 04:27:44.614814 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.614766 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "49150035-8fd6-4d67-86db-5acab25e44fd" (UID: "49150035-8fd6-4d67-86db-5acab25e44fd"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 04:27:44.624984 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.624945 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-web-config" (OuterVolumeSpecName: "web-config") pod "49150035-8fd6-4d67-86db-5acab25e44fd" (UID: "49150035-8fd6-4d67-86db-5acab25e44fd"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 04:27:44.713014 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.712972 2567 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-thanos-prometheus-http-client-file\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:27:44.713014 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.713009 2567 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-grpc-tls\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:27:44.713014 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.713021 2567 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/49150035-8fd6-4d67-86db-5acab25e44fd-tls-assets\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:27:44.713251 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.713031 2567 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/49150035-8fd6-4d67-86db-5acab25e44fd-prometheus-k8s-db\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:27:44.713251 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.713041 2567 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:27:44.713251 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.713052 2567 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-configmap-metrics-client-ca\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:27:44.713251 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.713060 2567 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-metrics-client-certs\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:27:44.713251 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.713069 2567 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/49150035-8fd6-4d67-86db-5acab25e44fd-config-out\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:27:44.713251 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.713078 2567 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-web-config\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:27:44.713251 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.713086 2567 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:27:44.713251 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.713094 2567 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49150035-8fd6-4d67-86db-5acab25e44fd-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:27:44.713251 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.713103 2567 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-config\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:27:44.713251 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.713111 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fcwqm\" (UniqueName: \"kubernetes.io/projected/49150035-8fd6-4d67-86db-5acab25e44fd-kube-api-access-fcwqm\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:27:44.713251 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.713119 2567 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-kube-rbac-proxy\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:27:44.713251 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.713129 2567 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:27:44.713251 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:44.713137 2567 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/49150035-8fd6-4d67-86db-5acab25e44fd-secret-prometheus-k8s-tls\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:27:45.420970 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.420933 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"49150035-8fd6-4d67-86db-5acab25e44fd","Type":"ContainerDied","Data":"9b5da6228428038ed99525e352e611d164d67544aadbfe1f7dd7e91ccb05102e"} Apr 16 04:27:45.421394 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.420987 2567 scope.go:117] "RemoveContainer" containerID="314b35c23916e466568d57753891c9bd9497b4c3c2cc7bbddd733e67789802b6" Apr 16 04:27:45.421394 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.421000 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.429705 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.429684 2567 scope.go:117] "RemoveContainer" containerID="15b47f7232bd0abc80aeb6dcd4c02116eeeb6d77c8b548b27f7f7a99d5073d4a" Apr 16 04:27:45.437753 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.437734 2567 scope.go:117] "RemoveContainer" containerID="5d010beaaf827fb0f0b6b3dcf36a45e71372d50d3a053fb6aea05e55930a79cc" Apr 16 04:27:45.442168 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.442141 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 04:27:45.446965 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.446932 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 04:27:45.449690 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.449665 2567 scope.go:117] "RemoveContainer" containerID="f7da96b0897eeae8d30f813498ad151d0fc8cb40101ce8c7dbac8b15317dd57f" Apr 16 04:27:45.457546 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.457524 2567 scope.go:117] "RemoveContainer" containerID="5e7a51f525a719a5696adfc79018a0b2b1769611c56ce7586e2c26757f4df3b1" Apr 16 04:27:45.465189 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.465164 2567 scope.go:117] "RemoveContainer" containerID="874585e2b570724bde5a5a90591d9d0b24f91ba4bf36658864760c74c58c0d07" Apr 16 04:27:45.468683 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.468645 2567 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 04:27:45.469016 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.468998 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="thanos-sidecar" Apr 16 04:27:45.469016 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.469016 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="thanos-sidecar" Apr 16 04:27:45.469136 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.469026 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="init-config-reloader" Apr 16 04:27:45.469136 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.469032 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="init-config-reloader" Apr 16 04:27:45.469136 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.469039 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="kube-rbac-proxy-web" Apr 16 04:27:45.469136 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.469045 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="kube-rbac-proxy-web" Apr 16 04:27:45.469136 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.469058 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="config-reloader" Apr 16 04:27:45.469136 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.469066 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="config-reloader" Apr 16 04:27:45.469136 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.469075 2567 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="kube-rbac-proxy" Apr 16 04:27:45.469136 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.469083 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="kube-rbac-proxy" Apr 16 04:27:45.469136 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.469094 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="kube-rbac-proxy-thanos" Apr 16 04:27:45.469136 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.469102 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="kube-rbac-proxy-thanos" Apr 16 04:27:45.469136 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.469124 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="prometheus" Apr 16 04:27:45.469136 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.469131 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="prometheus" Apr 16 04:27:45.469521 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.469199 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="kube-rbac-proxy-web" Apr 16 04:27:45.469521 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.469213 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="thanos-sidecar" Apr 16 04:27:45.469521 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.469222 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="prometheus" Apr 16 04:27:45.469521 ip-10-0-133-81 kubenswrapper[2567]: I0416 
04:27:45.469228 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="kube-rbac-proxy-thanos" Apr 16 04:27:45.469521 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.469235 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="kube-rbac-proxy" Apr 16 04:27:45.469521 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.469241 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" containerName="config-reloader" Apr 16 04:27:45.474037 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.474011 2567 scope.go:117] "RemoveContainer" containerID="3ec0478eba7176deb36b2e16181b697279ce0df6147e0ed35c31b2fbc1746f4a" Apr 16 04:27:45.474261 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.474245 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.477147 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.477121 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 04:27:45.477287 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.477169 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 04:27:45.477433 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.477415 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 04:27:45.477501 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.477466 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 04:27:45.477588 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.477567 2567 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 04:27:45.477711 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.477642 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5gt7vnkakqk52\"" Apr 16 04:27:45.477711 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.477706 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 04:27:45.477845 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.477780 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 04:27:45.477901 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.477855 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 04:27:45.478028 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.478013 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-8xqx8\"" Apr 16 04:27:45.478127 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.478090 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 04:27:45.478186 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.478178 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 04:27:45.478357 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.478335 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 04:27:45.480606 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.480587 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 04:27:45.484774 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.484645 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 04:27:45.486029 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.486007 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 04:27:45.520096 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.520060 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b55b096a-6b44-4567-8071-2f7bb51e4c6a-config-out\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.520096 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.520100 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.520300 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.520123 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.520300 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.520142 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b55b096a-6b44-4567-8071-2f7bb51e4c6a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.520300 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.520157 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b55b096a-6b44-4567-8071-2f7bb51e4c6a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.520300 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.520223 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b55b096a-6b44-4567-8071-2f7bb51e4c6a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.520300 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.520241 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6n8v\" (UniqueName: \"kubernetes.io/projected/b55b096a-6b44-4567-8071-2f7bb51e4c6a-kube-api-access-n6n8v\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.520300 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.520268 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b55b096a-6b44-4567-8071-2f7bb51e4c6a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.520300 ip-10-0-133-81 
kubenswrapper[2567]: I0416 04:27:45.520284 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.520505 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.520307 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b55b096a-6b44-4567-8071-2f7bb51e4c6a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.520505 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.520339 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.520505 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.520356 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-config\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.520505 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.520379 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-secret-prometheus-k8s-thanos-sidecar-tls\") pod 
\"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.520505 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.520394 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.520505 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.520416 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b55b096a-6b44-4567-8071-2f7bb51e4c6a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.520505 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.520445 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b55b096a-6b44-4567-8071-2f7bb51e4c6a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.520505 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.520461 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-web-config\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.520505 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.520489 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.621472 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.621397 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b55b096a-6b44-4567-8071-2f7bb51e4c6a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.621472 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.621444 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b55b096a-6b44-4567-8071-2f7bb51e4c6a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.621472 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.621465 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-web-config\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.621715 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.621500 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.621715 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.621526 
2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b55b096a-6b44-4567-8071-2f7bb51e4c6a-config-out\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.621715 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.621550 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.621715 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.621578 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.621715 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.621600 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b55b096a-6b44-4567-8071-2f7bb51e4c6a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.621715 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.621625 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b55b096a-6b44-4567-8071-2f7bb51e4c6a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.621715 ip-10-0-133-81 
kubenswrapper[2567]: I0416 04:27:45.621670 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b55b096a-6b44-4567-8071-2f7bb51e4c6a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.621715 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.621694 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6n8v\" (UniqueName: \"kubernetes.io/projected/b55b096a-6b44-4567-8071-2f7bb51e4c6a-kube-api-access-n6n8v\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.622513 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.621731 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b55b096a-6b44-4567-8071-2f7bb51e4c6a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.622513 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.621754 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.622513 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.621787 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b55b096a-6b44-4567-8071-2f7bb51e4c6a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.622513 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.621861 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.622513 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.621888 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-config\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.622513 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.621927 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.622513 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.621954 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.622513 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.622068 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b55b096a-6b44-4567-8071-2f7bb51e4c6a-prometheus-k8s-db\") pod 
\"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.622513 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.622297 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b55b096a-6b44-4567-8071-2f7bb51e4c6a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.623517 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.622697 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b55b096a-6b44-4567-8071-2f7bb51e4c6a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.623517 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.623428 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b55b096a-6b44-4567-8071-2f7bb51e4c6a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.624208 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.624181 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b55b096a-6b44-4567-8071-2f7bb51e4c6a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.625233 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.625154 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" 
(UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.625233 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.625164 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.625756 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.625699 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.625875 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.625769 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b55b096a-6b44-4567-8071-2f7bb51e4c6a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.626259 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.626178 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-config\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.626259 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.626186 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/b55b096a-6b44-4567-8071-2f7bb51e4c6a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.626424 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.626256 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.626492 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.626477 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-web-config\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.627156 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.627129 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b55b096a-6b44-4567-8071-2f7bb51e4c6a-config-out\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.627452 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.627432 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.627893 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.627874 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" 
(UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.628209 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.628186 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b55b096a-6b44-4567-8071-2f7bb51e4c6a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.630522 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.630498 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6n8v\" (UniqueName: \"kubernetes.io/projected/b55b096a-6b44-4567-8071-2f7bb51e4c6a-kube-api-access-n6n8v\") pod \"prometheus-k8s-0\" (UID: \"b55b096a-6b44-4567-8071-2f7bb51e4c6a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.786734 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.786677 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:27:45.945899 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:45.945780 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 04:27:45.949901 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:27:45.949866 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb55b096a_6b44_4567_8071_2f7bb51e4c6a.slice/crio-f3ccf0587168fba5b689798b021d15a737be94a3fe5bff03568c810a0bcac200 WatchSource:0}: Error finding container f3ccf0587168fba5b689798b021d15a737be94a3fe5bff03568c810a0bcac200: Status 404 returned error can't find the container with id f3ccf0587168fba5b689798b021d15a737be94a3fe5bff03568c810a0bcac200 Apr 16 04:27:46.424925 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:46.424893 2567 generic.go:358] "Generic (PLEG): container finished" podID="b55b096a-6b44-4567-8071-2f7bb51e4c6a" containerID="df8b25be95eee4361d37ffab1d75a57077ee7b9305ed66bccafdfdb90d6ecf72" exitCode=0 Apr 16 04:27:46.425362 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:46.424986 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b55b096a-6b44-4567-8071-2f7bb51e4c6a","Type":"ContainerDied","Data":"df8b25be95eee4361d37ffab1d75a57077ee7b9305ed66bccafdfdb90d6ecf72"} Apr 16 04:27:46.425362 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:46.425022 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b55b096a-6b44-4567-8071-2f7bb51e4c6a","Type":"ContainerStarted","Data":"f3ccf0587168fba5b689798b021d15a737be94a3fe5bff03568c810a0bcac200"} Apr 16 04:27:46.686024 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:46.685987 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49150035-8fd6-4d67-86db-5acab25e44fd" 
path="/var/lib/kubelet/pods/49150035-8fd6-4d67-86db-5acab25e44fd/volumes" Apr 16 04:27:47.432847 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:47.432784 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b55b096a-6b44-4567-8071-2f7bb51e4c6a","Type":"ContainerStarted","Data":"b097fb35816272bd96363556dcc01d4354ba3cc1bc0da1e0f32c13edff51babd"} Apr 16 04:27:47.432847 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:47.432850 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b55b096a-6b44-4567-8071-2f7bb51e4c6a","Type":"ContainerStarted","Data":"c8bee4b1a84f78957b69a9974ca418193e21dcb0751dab4b91c6459c5d3685d3"} Apr 16 04:27:47.433337 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:47.432865 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b55b096a-6b44-4567-8071-2f7bb51e4c6a","Type":"ContainerStarted","Data":"6c68a4e7870f2b1a97eeda4eaef26579109dca4f0a340c32cd10e81267f026b8"} Apr 16 04:27:47.433337 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:47.432876 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b55b096a-6b44-4567-8071-2f7bb51e4c6a","Type":"ContainerStarted","Data":"6285103f1bcadf41abe49e5b285cc6ea8a8f256e60549fded85aff99e1f6c290"} Apr 16 04:27:47.433337 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:47.432886 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b55b096a-6b44-4567-8071-2f7bb51e4c6a","Type":"ContainerStarted","Data":"72a2f521c0f98319a83cb07ce660309ea88a040f1850093cc997a1bd52997ea2"} Apr 16 04:27:47.433337 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:47.432897 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"b55b096a-6b44-4567-8071-2f7bb51e4c6a","Type":"ContainerStarted","Data":"2e87fdc6b3f62a055265e8a4c74df9b6dcbf7157d9e7e9c05e8018138874a191"} Apr 16 04:27:47.460549 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:47.460480 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.460457651 podStartE2EDuration="2.460457651s" podCreationTimestamp="2026-04-16 04:27:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 04:27:47.457476981 +0000 UTC m=+223.442927046" watchObservedRunningTime="2026-04-16 04:27:47.460457651 +0000 UTC m=+223.445907716" Apr 16 04:27:50.786966 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:27:50.786912 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:28:45.787432 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:28:45.787387 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:28:45.802923 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:28:45.802896 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:28:46.628989 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:28:46.628962 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 04:29:03.432125 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:03.432091 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-5gks4"] Apr 16 04:29:03.435256 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:03.435236 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-5gks4" Apr 16 04:29:03.437935 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:03.437905 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 04:29:03.437935 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:03.437928 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 04:29:03.439182 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:03.439163 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-v96qx\"" Apr 16 04:29:03.442460 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:03.442381 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-5gks4"] Apr 16 04:29:03.593754 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:03.593716 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c6a78441-f2de-4ff8-b40a-d629627545a6-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-5gks4\" (UID: \"c6a78441-f2de-4ff8-b40a-d629627545a6\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-5gks4" Apr 16 04:29:03.593962 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:03.593767 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tvsp\" (UniqueName: \"kubernetes.io/projected/c6a78441-f2de-4ff8-b40a-d629627545a6-kube-api-access-7tvsp\") pod \"cert-manager-cainjector-8966b78d4-5gks4\" (UID: \"c6a78441-f2de-4ff8-b40a-d629627545a6\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-5gks4" Apr 16 04:29:03.694951 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:03.694858 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c6a78441-f2de-4ff8-b40a-d629627545a6-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-5gks4\" (UID: \"c6a78441-f2de-4ff8-b40a-d629627545a6\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-5gks4" Apr 16 04:29:03.694951 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:03.694916 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tvsp\" (UniqueName: \"kubernetes.io/projected/c6a78441-f2de-4ff8-b40a-d629627545a6-kube-api-access-7tvsp\") pod \"cert-manager-cainjector-8966b78d4-5gks4\" (UID: \"c6a78441-f2de-4ff8-b40a-d629627545a6\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-5gks4" Apr 16 04:29:03.704667 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:03.704634 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c6a78441-f2de-4ff8-b40a-d629627545a6-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-5gks4\" (UID: \"c6a78441-f2de-4ff8-b40a-d629627545a6\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-5gks4" Apr 16 04:29:03.704798 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:03.704709 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tvsp\" (UniqueName: \"kubernetes.io/projected/c6a78441-f2de-4ff8-b40a-d629627545a6-kube-api-access-7tvsp\") pod \"cert-manager-cainjector-8966b78d4-5gks4\" (UID: \"c6a78441-f2de-4ff8-b40a-d629627545a6\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-5gks4" Apr 16 04:29:03.762906 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:03.762864 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-5gks4" Apr 16 04:29:03.883556 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:03.883521 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-5gks4"] Apr 16 04:29:03.887128 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:29:03.887105 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6a78441_f2de_4ff8_b40a_d629627545a6.slice/crio-4ce91f1c57a502925479f6c8762cbcb7962c74fd18446a5d3a18de6d2995ccd6 WatchSource:0}: Error finding container 4ce91f1c57a502925479f6c8762cbcb7962c74fd18446a5d3a18de6d2995ccd6: Status 404 returned error can't find the container with id 4ce91f1c57a502925479f6c8762cbcb7962c74fd18446a5d3a18de6d2995ccd6 Apr 16 04:29:04.492981 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:04.492953 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-chgzn_7cfe1750-0b4e-43ab-b858-92eb84a5bd2a/cluster-monitoring-operator/0.log" Apr 16 04:29:04.493394 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:04.493068 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-chgzn_7cfe1750-0b4e-43ab-b858-92eb84a5bd2a/cluster-monitoring-operator/0.log" Apr 16 04:29:04.500068 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:04.500039 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/ovn-acl-logging/0.log" Apr 16 04:29:04.500348 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:04.500327 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/ovn-acl-logging/0.log" Apr 16 04:29:04.506502 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:04.506479 2567 
kubelet.go:1628] "Image garbage collection succeeded" Apr 16 04:29:04.667800 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:04.667639 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-5gks4" event={"ID":"c6a78441-f2de-4ff8-b40a-d629627545a6","Type":"ContainerStarted","Data":"4ce91f1c57a502925479f6c8762cbcb7962c74fd18446a5d3a18de6d2995ccd6"} Apr 16 04:29:07.679228 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:07.679187 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-5gks4" event={"ID":"c6a78441-f2de-4ff8-b40a-d629627545a6","Type":"ContainerStarted","Data":"03b0936fa8c132c46264bda03f12aab351fdf21de4f32db142381a7867dab9cd"} Apr 16 04:29:07.693438 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:07.693384 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-5gks4" podStartSLOduration=1.457844701 podStartE2EDuration="4.693368719s" podCreationTimestamp="2026-04-16 04:29:03 +0000 UTC" firstStartedPulling="2026-04-16 04:29:03.8894388 +0000 UTC m=+299.874888842" lastFinishedPulling="2026-04-16 04:29:07.124962818 +0000 UTC m=+303.110412860" observedRunningTime="2026-04-16 04:29:07.692947051 +0000 UTC m=+303.678397127" watchObservedRunningTime="2026-04-16 04:29:07.693368719 +0000 UTC m=+303.678818783" Apr 16 04:29:18.879562 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:18.879525 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-bs8d5"] Apr 16 04:29:18.882771 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:18.882751 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bs8d5" Apr 16 04:29:18.885705 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:18.885668 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-5264b\"" Apr 16 04:29:18.885878 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:18.885745 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 04:29:18.885878 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:18.885745 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 04:29:18.889015 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:18.888988 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-bs8d5"] Apr 16 04:29:18.921042 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:18.921001 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-748g4\" (UniqueName: \"kubernetes.io/projected/5493e96d-031c-4086-835b-81957ad31956-kube-api-access-748g4\") pod \"openshift-lws-operator-bfc7f696d-bs8d5\" (UID: \"5493e96d-031c-4086-835b-81957ad31956\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bs8d5" Apr 16 04:29:18.921223 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:18.921072 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5493e96d-031c-4086-835b-81957ad31956-tmp\") pod \"openshift-lws-operator-bfc7f696d-bs8d5\" (UID: \"5493e96d-031c-4086-835b-81957ad31956\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bs8d5" Apr 16 04:29:19.021506 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:19.021455 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5493e96d-031c-4086-835b-81957ad31956-tmp\") pod \"openshift-lws-operator-bfc7f696d-bs8d5\" (UID: \"5493e96d-031c-4086-835b-81957ad31956\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bs8d5" Apr 16 04:29:19.021684 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:19.021532 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-748g4\" (UniqueName: \"kubernetes.io/projected/5493e96d-031c-4086-835b-81957ad31956-kube-api-access-748g4\") pod \"openshift-lws-operator-bfc7f696d-bs8d5\" (UID: \"5493e96d-031c-4086-835b-81957ad31956\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bs8d5" Apr 16 04:29:19.021917 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:19.021895 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5493e96d-031c-4086-835b-81957ad31956-tmp\") pod \"openshift-lws-operator-bfc7f696d-bs8d5\" (UID: \"5493e96d-031c-4086-835b-81957ad31956\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bs8d5" Apr 16 04:29:19.029228 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:19.029192 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-748g4\" (UniqueName: \"kubernetes.io/projected/5493e96d-031c-4086-835b-81957ad31956-kube-api-access-748g4\") pod \"openshift-lws-operator-bfc7f696d-bs8d5\" (UID: \"5493e96d-031c-4086-835b-81957ad31956\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bs8d5" Apr 16 04:29:19.194331 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:19.194229 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bs8d5" Apr 16 04:29:19.319653 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:19.319622 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-bs8d5"] Apr 16 04:29:19.322614 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:29:19.322581 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5493e96d_031c_4086_835b_81957ad31956.slice/crio-ea7f93a8f1fb76206c8e072b47f89075f81ec81aec498f5f2174a720a60cc1e0 WatchSource:0}: Error finding container ea7f93a8f1fb76206c8e072b47f89075f81ec81aec498f5f2174a720a60cc1e0: Status 404 returned error can't find the container with id ea7f93a8f1fb76206c8e072b47f89075f81ec81aec498f5f2174a720a60cc1e0 Apr 16 04:29:19.324529 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:19.324510 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 04:29:19.713863 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:19.713807 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bs8d5" event={"ID":"5493e96d-031c-4086-835b-81957ad31956","Type":"ContainerStarted","Data":"ea7f93a8f1fb76206c8e072b47f89075f81ec81aec498f5f2174a720a60cc1e0"} Apr 16 04:29:22.723506 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:22.723472 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bs8d5" event={"ID":"5493e96d-031c-4086-835b-81957ad31956","Type":"ContainerStarted","Data":"483f62ac42b1bbaddf6f2bacd0780eed338ac9012c3eee2a59888389043fc7f6"} Apr 16 04:29:22.739531 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:22.739482 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-bs8d5" 
podStartSLOduration=2.308758378 podStartE2EDuration="4.739465559s" podCreationTimestamp="2026-04-16 04:29:18 +0000 UTC" firstStartedPulling="2026-04-16 04:29:19.324635047 +0000 UTC m=+315.310085089" lastFinishedPulling="2026-04-16 04:29:21.755342227 +0000 UTC m=+317.740792270" observedRunningTime="2026-04-16 04:29:22.738218602 +0000 UTC m=+318.723668666" watchObservedRunningTime="2026-04-16 04:29:22.739465559 +0000 UTC m=+318.724915624" Apr 16 04:29:39.560256 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:39.560224 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-c7946b447-pjqx2"] Apr 16 04:29:39.565526 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:39.565508 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-pjqx2" Apr 16 04:29:39.568409 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:39.568382 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 16 04:29:39.568409 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:39.568407 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 16 04:29:39.568594 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:39.568448 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-zzbzl\"" Apr 16 04:29:39.568594 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:39.568387 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 04:29:39.568594 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:39.568382 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 
04:29:39.575115 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:39.575092 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-c7946b447-pjqx2"] Apr 16 04:29:39.701633 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:39.701594 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkl84\" (UniqueName: \"kubernetes.io/projected/2d4adf34-12c0-4e76-b7e2-e6c39c7e7355-kube-api-access-mkl84\") pod \"opendatahub-operator-controller-manager-c7946b447-pjqx2\" (UID: \"2d4adf34-12c0-4e76-b7e2-e6c39c7e7355\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-pjqx2" Apr 16 04:29:39.701863 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:39.701674 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d4adf34-12c0-4e76-b7e2-e6c39c7e7355-apiservice-cert\") pod \"opendatahub-operator-controller-manager-c7946b447-pjqx2\" (UID: \"2d4adf34-12c0-4e76-b7e2-e6c39c7e7355\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-pjqx2" Apr 16 04:29:39.701863 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:39.701721 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d4adf34-12c0-4e76-b7e2-e6c39c7e7355-webhook-cert\") pod \"opendatahub-operator-controller-manager-c7946b447-pjqx2\" (UID: \"2d4adf34-12c0-4e76-b7e2-e6c39c7e7355\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-pjqx2" Apr 16 04:29:39.802459 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:39.802424 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkl84\" (UniqueName: \"kubernetes.io/projected/2d4adf34-12c0-4e76-b7e2-e6c39c7e7355-kube-api-access-mkl84\") pod 
\"opendatahub-operator-controller-manager-c7946b447-pjqx2\" (UID: \"2d4adf34-12c0-4e76-b7e2-e6c39c7e7355\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-pjqx2" Apr 16 04:29:39.802614 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:39.802483 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d4adf34-12c0-4e76-b7e2-e6c39c7e7355-apiservice-cert\") pod \"opendatahub-operator-controller-manager-c7946b447-pjqx2\" (UID: \"2d4adf34-12c0-4e76-b7e2-e6c39c7e7355\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-pjqx2" Apr 16 04:29:39.802614 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:39.802594 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d4adf34-12c0-4e76-b7e2-e6c39c7e7355-webhook-cert\") pod \"opendatahub-operator-controller-manager-c7946b447-pjqx2\" (UID: \"2d4adf34-12c0-4e76-b7e2-e6c39c7e7355\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-pjqx2" Apr 16 04:29:39.805113 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:39.805081 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d4adf34-12c0-4e76-b7e2-e6c39c7e7355-apiservice-cert\") pod \"opendatahub-operator-controller-manager-c7946b447-pjqx2\" (UID: \"2d4adf34-12c0-4e76-b7e2-e6c39c7e7355\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-pjqx2" Apr 16 04:29:39.805231 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:39.805081 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d4adf34-12c0-4e76-b7e2-e6c39c7e7355-webhook-cert\") pod \"opendatahub-operator-controller-manager-c7946b447-pjqx2\" (UID: \"2d4adf34-12c0-4e76-b7e2-e6c39c7e7355\") " 
pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-pjqx2" Apr 16 04:29:39.814161 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:39.814104 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkl84\" (UniqueName: \"kubernetes.io/projected/2d4adf34-12c0-4e76-b7e2-e6c39c7e7355-kube-api-access-mkl84\") pod \"opendatahub-operator-controller-manager-c7946b447-pjqx2\" (UID: \"2d4adf34-12c0-4e76-b7e2-e6c39c7e7355\") " pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-pjqx2" Apr 16 04:29:39.876871 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:39.876776 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-pjqx2" Apr 16 04:29:40.007008 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:40.006979 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-c7946b447-pjqx2"] Apr 16 04:29:40.009673 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:29:40.009641 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d4adf34_12c0_4e76_b7e2_e6c39c7e7355.slice/crio-f001c4b3f1ab957b7732f754854fff3bf056641fbe660d80ec71d1262136ca4c WatchSource:0}: Error finding container f001c4b3f1ab957b7732f754854fff3bf056641fbe660d80ec71d1262136ca4c: Status 404 returned error can't find the container with id f001c4b3f1ab957b7732f754854fff3bf056641fbe660d80ec71d1262136ca4c Apr 16 04:29:40.787335 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:40.787268 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-pjqx2" event={"ID":"2d4adf34-12c0-4e76-b7e2-e6c39c7e7355","Type":"ContainerStarted","Data":"f001c4b3f1ab957b7732f754854fff3bf056641fbe660d80ec71d1262136ca4c"} Apr 16 04:29:42.801905 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:42.801867 2567 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-pjqx2" event={"ID":"2d4adf34-12c0-4e76-b7e2-e6c39c7e7355","Type":"ContainerStarted","Data":"9c43f7414aa7a04f276845d1ecdb13249545e091297b1ec30438eb825a6388fe"} Apr 16 04:29:42.802294 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:42.801990 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-pjqx2" Apr 16 04:29:42.824438 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:42.824380 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-pjqx2" podStartSLOduration=1.135936461 podStartE2EDuration="3.824362923s" podCreationTimestamp="2026-04-16 04:29:39 +0000 UTC" firstStartedPulling="2026-04-16 04:29:40.01153664 +0000 UTC m=+335.996986682" lastFinishedPulling="2026-04-16 04:29:42.699963088 +0000 UTC m=+338.685413144" observedRunningTime="2026-04-16 04:29:42.822166558 +0000 UTC m=+338.807616621" watchObservedRunningTime="2026-04-16 04:29:42.824362923 +0000 UTC m=+338.809812986" Apr 16 04:29:53.807303 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:53.807223 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-c7946b447-pjqx2" Apr 16 04:29:57.099505 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:57.099463 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-68d6cc647-m6stc"] Apr 16 04:29:57.107095 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:57.107070 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-68d6cc647-m6stc" Apr 16 04:29:57.111182 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:57.111152 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 04:29:57.111488 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:57.111455 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 04:29:57.111488 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:57.111483 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 16 04:29:57.111656 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:57.111158 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-5rvrm\"" Apr 16 04:29:57.111733 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:57.111715 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 16 04:29:57.113073 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:57.113046 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-68d6cc647-m6stc"] Apr 16 04:29:57.262190 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:57.262151 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv4pm\" (UniqueName: \"kubernetes.io/projected/a9d9ae6f-6579-4770-98c2-5083dadcb564-kube-api-access-jv4pm\") pod \"kube-auth-proxy-68d6cc647-m6stc\" (UID: \"a9d9ae6f-6579-4770-98c2-5083dadcb564\") " pod="openshift-ingress/kube-auth-proxy-68d6cc647-m6stc" Apr 16 04:29:57.262350 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:57.262202 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/a9d9ae6f-6579-4770-98c2-5083dadcb564-tmp\") pod \"kube-auth-proxy-68d6cc647-m6stc\" (UID: \"a9d9ae6f-6579-4770-98c2-5083dadcb564\") " pod="openshift-ingress/kube-auth-proxy-68d6cc647-m6stc"
Apr 16 04:29:57.262350 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:57.262310 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d9ae6f-6579-4770-98c2-5083dadcb564-tls-certs\") pod \"kube-auth-proxy-68d6cc647-m6stc\" (UID: \"a9d9ae6f-6579-4770-98c2-5083dadcb564\") " pod="openshift-ingress/kube-auth-proxy-68d6cc647-m6stc"
Apr 16 04:29:57.363455 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:57.363350 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d9ae6f-6579-4770-98c2-5083dadcb564-tls-certs\") pod \"kube-auth-proxy-68d6cc647-m6stc\" (UID: \"a9d9ae6f-6579-4770-98c2-5083dadcb564\") " pod="openshift-ingress/kube-auth-proxy-68d6cc647-m6stc"
Apr 16 04:29:57.363455 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:57.363412 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jv4pm\" (UniqueName: \"kubernetes.io/projected/a9d9ae6f-6579-4770-98c2-5083dadcb564-kube-api-access-jv4pm\") pod \"kube-auth-proxy-68d6cc647-m6stc\" (UID: \"a9d9ae6f-6579-4770-98c2-5083dadcb564\") " pod="openshift-ingress/kube-auth-proxy-68d6cc647-m6stc"
Apr 16 04:29:57.363455 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:57.363448 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a9d9ae6f-6579-4770-98c2-5083dadcb564-tmp\") pod \"kube-auth-proxy-68d6cc647-m6stc\" (UID: \"a9d9ae6f-6579-4770-98c2-5083dadcb564\") " pod="openshift-ingress/kube-auth-proxy-68d6cc647-m6stc"
Apr 16 04:29:57.365685 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:57.365648 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a9d9ae6f-6579-4770-98c2-5083dadcb564-tmp\") pod \"kube-auth-proxy-68d6cc647-m6stc\" (UID: \"a9d9ae6f-6579-4770-98c2-5083dadcb564\") " pod="openshift-ingress/kube-auth-proxy-68d6cc647-m6stc"
Apr 16 04:29:57.365897 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:57.365878 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a9d9ae6f-6579-4770-98c2-5083dadcb564-tls-certs\") pod \"kube-auth-proxy-68d6cc647-m6stc\" (UID: \"a9d9ae6f-6579-4770-98c2-5083dadcb564\") " pod="openshift-ingress/kube-auth-proxy-68d6cc647-m6stc"
Apr 16 04:29:57.374102 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:57.374080 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv4pm\" (UniqueName: \"kubernetes.io/projected/a9d9ae6f-6579-4770-98c2-5083dadcb564-kube-api-access-jv4pm\") pod \"kube-auth-proxy-68d6cc647-m6stc\" (UID: \"a9d9ae6f-6579-4770-98c2-5083dadcb564\") " pod="openshift-ingress/kube-auth-proxy-68d6cc647-m6stc"
Apr 16 04:29:57.418170 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:57.418135 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-68d6cc647-m6stc"
Apr 16 04:29:57.541686 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:57.541636 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-68d6cc647-m6stc"]
Apr 16 04:29:57.544144 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:29:57.544114 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9d9ae6f_6579_4770_98c2_5083dadcb564.slice/crio-43d934053c5746396e6b76701f6697e56a908bee7c93a3335c6abcf67937414e WatchSource:0}: Error finding container 43d934053c5746396e6b76701f6697e56a908bee7c93a3335c6abcf67937414e: Status 404 returned error can't find the container with id 43d934053c5746396e6b76701f6697e56a908bee7c93a3335c6abcf67937414e
Apr 16 04:29:57.855606 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:29:57.855573 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-68d6cc647-m6stc" event={"ID":"a9d9ae6f-6579-4770-98c2-5083dadcb564","Type":"ContainerStarted","Data":"43d934053c5746396e6b76701f6697e56a908bee7c93a3335c6abcf67937414e"}
Apr 16 04:30:00.267561 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:00.267530 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-7bxmc"]
Apr 16 04:30:00.271314 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:00.271293 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-7bxmc"
Apr 16 04:30:00.273735 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:00.273708 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-9q6gf\""
Apr 16 04:30:00.273880 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:00.273707 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 16 04:30:00.277501 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:00.277449 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-7bxmc"]
Apr 16 04:30:00.390681 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:00.390641 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a850ebad-0876-49e7-a7cf-5b05d5b500ec-cert\") pod \"odh-model-controller-858dbf95b8-7bxmc\" (UID: \"a850ebad-0876-49e7-a7cf-5b05d5b500ec\") " pod="opendatahub/odh-model-controller-858dbf95b8-7bxmc"
Apr 16 04:30:00.390898 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:00.390723 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkbxh\" (UniqueName: \"kubernetes.io/projected/a850ebad-0876-49e7-a7cf-5b05d5b500ec-kube-api-access-wkbxh\") pod \"odh-model-controller-858dbf95b8-7bxmc\" (UID: \"a850ebad-0876-49e7-a7cf-5b05d5b500ec\") " pod="opendatahub/odh-model-controller-858dbf95b8-7bxmc"
Apr 16 04:30:00.491588 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:00.491549 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a850ebad-0876-49e7-a7cf-5b05d5b500ec-cert\") pod \"odh-model-controller-858dbf95b8-7bxmc\" (UID: \"a850ebad-0876-49e7-a7cf-5b05d5b500ec\") " pod="opendatahub/odh-model-controller-858dbf95b8-7bxmc"
Apr 16 04:30:00.491778 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:00.491627 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkbxh\" (UniqueName: \"kubernetes.io/projected/a850ebad-0876-49e7-a7cf-5b05d5b500ec-kube-api-access-wkbxh\") pod \"odh-model-controller-858dbf95b8-7bxmc\" (UID: \"a850ebad-0876-49e7-a7cf-5b05d5b500ec\") " pod="opendatahub/odh-model-controller-858dbf95b8-7bxmc"
Apr 16 04:30:00.491778 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:30:00.491736 2567 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 16 04:30:00.491938 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:30:00.491845 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a850ebad-0876-49e7-a7cf-5b05d5b500ec-cert podName:a850ebad-0876-49e7-a7cf-5b05d5b500ec nodeName:}" failed. No retries permitted until 2026-04-16 04:30:00.991807264 +0000 UTC m=+356.977257307 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a850ebad-0876-49e7-a7cf-5b05d5b500ec-cert") pod "odh-model-controller-858dbf95b8-7bxmc" (UID: "a850ebad-0876-49e7-a7cf-5b05d5b500ec") : secret "odh-model-controller-webhook-cert" not found
Apr 16 04:30:00.500537 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:00.500505 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkbxh\" (UniqueName: \"kubernetes.io/projected/a850ebad-0876-49e7-a7cf-5b05d5b500ec-kube-api-access-wkbxh\") pod \"odh-model-controller-858dbf95b8-7bxmc\" (UID: \"a850ebad-0876-49e7-a7cf-5b05d5b500ec\") " pod="opendatahub/odh-model-controller-858dbf95b8-7bxmc"
Apr 16 04:30:00.996890 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:00.996860 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a850ebad-0876-49e7-a7cf-5b05d5b500ec-cert\") pod \"odh-model-controller-858dbf95b8-7bxmc\" (UID: \"a850ebad-0876-49e7-a7cf-5b05d5b500ec\") " pod="opendatahub/odh-model-controller-858dbf95b8-7bxmc"
Apr 16 04:30:00.997032 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:30:00.996992 2567 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 16 04:30:00.997072 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:30:00.997054 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a850ebad-0876-49e7-a7cf-5b05d5b500ec-cert podName:a850ebad-0876-49e7-a7cf-5b05d5b500ec nodeName:}" failed. No retries permitted until 2026-04-16 04:30:01.997037234 +0000 UTC m=+357.982487277 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a850ebad-0876-49e7-a7cf-5b05d5b500ec-cert") pod "odh-model-controller-858dbf95b8-7bxmc" (UID: "a850ebad-0876-49e7-a7cf-5b05d5b500ec") : secret "odh-model-controller-webhook-cert" not found
Apr 16 04:30:01.870973 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:01.870932 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-68d6cc647-m6stc" event={"ID":"a9d9ae6f-6579-4770-98c2-5083dadcb564","Type":"ContainerStarted","Data":"601066609a794d27c82f2c0cc822743dc5c42b442e050629c359699997f6171e"}
Apr 16 04:30:01.886739 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:01.886687 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-68d6cc647-m6stc" podStartSLOduration=1.492864969 podStartE2EDuration="4.886666771s" podCreationTimestamp="2026-04-16 04:29:57 +0000 UTC" firstStartedPulling="2026-04-16 04:29:57.545806563 +0000 UTC m=+353.531256606" lastFinishedPulling="2026-04-16 04:30:00.939608352 +0000 UTC m=+356.925058408" observedRunningTime="2026-04-16 04:30:01.885150886 +0000 UTC m=+357.870600951" watchObservedRunningTime="2026-04-16 04:30:01.886666771 +0000 UTC m=+357.872116836"
Apr 16 04:30:02.006222 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:02.006175 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a850ebad-0876-49e7-a7cf-5b05d5b500ec-cert\") pod \"odh-model-controller-858dbf95b8-7bxmc\" (UID: \"a850ebad-0876-49e7-a7cf-5b05d5b500ec\") " pod="opendatahub/odh-model-controller-858dbf95b8-7bxmc"
Apr 16 04:30:02.008495 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:02.008469 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a850ebad-0876-49e7-a7cf-5b05d5b500ec-cert\") pod \"odh-model-controller-858dbf95b8-7bxmc\" (UID: \"a850ebad-0876-49e7-a7cf-5b05d5b500ec\") " pod="opendatahub/odh-model-controller-858dbf95b8-7bxmc"
Apr 16 04:30:02.086308 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:02.086269 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-7bxmc"
Apr 16 04:30:02.210494 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:02.210468 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-7bxmc"]
Apr 16 04:30:02.212933 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:30:02.212904 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda850ebad_0876_49e7_a7cf_5b05d5b500ec.slice/crio-bef73cc75f32d9cd9d63c8190bb815b779dec69b4f942ec1714d2914d673b688 WatchSource:0}: Error finding container bef73cc75f32d9cd9d63c8190bb815b779dec69b4f942ec1714d2914d673b688: Status 404 returned error can't find the container with id bef73cc75f32d9cd9d63c8190bb815b779dec69b4f942ec1714d2914d673b688
Apr 16 04:30:02.875907 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:02.875858 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7bxmc" event={"ID":"a850ebad-0876-49e7-a7cf-5b05d5b500ec","Type":"ContainerStarted","Data":"bef73cc75f32d9cd9d63c8190bb815b779dec69b4f942ec1714d2914d673b688"}
Apr 16 04:30:05.495551 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:30:05.490713 2567 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda850ebad_0876_49e7_a7cf_5b05d5b500ec.slice/crio-26aba7f45ecc76b80b27295f3df216dd8bd72a9d0246c6608ea5416621dacc87.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda850ebad_0876_49e7_a7cf_5b05d5b500ec.slice/crio-conmon-26aba7f45ecc76b80b27295f3df216dd8bd72a9d0246c6608ea5416621dacc87.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 04:30:05.888554 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:05.888513 2567 generic.go:358] "Generic (PLEG): container finished" podID="a850ebad-0876-49e7-a7cf-5b05d5b500ec" containerID="26aba7f45ecc76b80b27295f3df216dd8bd72a9d0246c6608ea5416621dacc87" exitCode=1
Apr 16 04:30:05.888813 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:05.888605 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7bxmc" event={"ID":"a850ebad-0876-49e7-a7cf-5b05d5b500ec","Type":"ContainerDied","Data":"26aba7f45ecc76b80b27295f3df216dd8bd72a9d0246c6608ea5416621dacc87"}
Apr 16 04:30:05.888921 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:05.888877 2567 scope.go:117] "RemoveContainer" containerID="26aba7f45ecc76b80b27295f3df216dd8bd72a9d0246c6608ea5416621dacc87"
Apr 16 04:30:05.897978 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:05.897942 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-bs8tv"]
Apr 16 04:30:05.901669 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:05.901647 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-bs8tv"
Apr 16 04:30:05.904296 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:05.904273 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\""
Apr 16 04:30:05.904437 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:05.904292 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-ff65k\""
Apr 16 04:30:05.914358 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:05.914329 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-bs8tv"]
Apr 16 04:30:05.947044 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:05.947011 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c21c3371-b668-4261-a94f-0d5af069b04c-cert\") pod \"kserve-controller-manager-856948b99f-bs8tv\" (UID: \"c21c3371-b668-4261-a94f-0d5af069b04c\") " pod="opendatahub/kserve-controller-manager-856948b99f-bs8tv"
Apr 16 04:30:05.947217 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:05.947051 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mgxw\" (UniqueName: \"kubernetes.io/projected/c21c3371-b668-4261-a94f-0d5af069b04c-kube-api-access-2mgxw\") pod \"kserve-controller-manager-856948b99f-bs8tv\" (UID: \"c21c3371-b668-4261-a94f-0d5af069b04c\") " pod="opendatahub/kserve-controller-manager-856948b99f-bs8tv"
Apr 16 04:30:06.048013 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:06.047976 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c21c3371-b668-4261-a94f-0d5af069b04c-cert\") pod \"kserve-controller-manager-856948b99f-bs8tv\" (UID: \"c21c3371-b668-4261-a94f-0d5af069b04c\") " pod="opendatahub/kserve-controller-manager-856948b99f-bs8tv"
Apr 16 04:30:06.048197 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:06.048025 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mgxw\" (UniqueName: \"kubernetes.io/projected/c21c3371-b668-4261-a94f-0d5af069b04c-kube-api-access-2mgxw\") pod \"kserve-controller-manager-856948b99f-bs8tv\" (UID: \"c21c3371-b668-4261-a94f-0d5af069b04c\") " pod="opendatahub/kserve-controller-manager-856948b99f-bs8tv"
Apr 16 04:30:06.048197 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:30:06.048154 2567 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 16 04:30:06.048274 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:30:06.048233 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c21c3371-b668-4261-a94f-0d5af069b04c-cert podName:c21c3371-b668-4261-a94f-0d5af069b04c nodeName:}" failed. No retries permitted until 2026-04-16 04:30:06.548213003 +0000 UTC m=+362.533663046 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c21c3371-b668-4261-a94f-0d5af069b04c-cert") pod "kserve-controller-manager-856948b99f-bs8tv" (UID: "c21c3371-b668-4261-a94f-0d5af069b04c") : secret "kserve-webhook-server-cert" not found
Apr 16 04:30:06.056665 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:06.056635 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mgxw\" (UniqueName: \"kubernetes.io/projected/c21c3371-b668-4261-a94f-0d5af069b04c-kube-api-access-2mgxw\") pod \"kserve-controller-manager-856948b99f-bs8tv\" (UID: \"c21c3371-b668-4261-a94f-0d5af069b04c\") " pod="opendatahub/kserve-controller-manager-856948b99f-bs8tv"
Apr 16 04:30:06.554308 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:06.554273 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c21c3371-b668-4261-a94f-0d5af069b04c-cert\") pod \"kserve-controller-manager-856948b99f-bs8tv\" (UID: \"c21c3371-b668-4261-a94f-0d5af069b04c\") " pod="opendatahub/kserve-controller-manager-856948b99f-bs8tv"
Apr 16 04:30:06.556738 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:06.556713 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c21c3371-b668-4261-a94f-0d5af069b04c-cert\") pod \"kserve-controller-manager-856948b99f-bs8tv\" (UID: \"c21c3371-b668-4261-a94f-0d5af069b04c\") " pod="opendatahub/kserve-controller-manager-856948b99f-bs8tv"
Apr 16 04:30:06.813803 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:06.813706 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-bs8tv"
Apr 16 04:30:06.894071 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:06.894039 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7bxmc" event={"ID":"a850ebad-0876-49e7-a7cf-5b05d5b500ec","Type":"ContainerStarted","Data":"dec7279c18b1080df6aee4ea177d3595e4041f9dde375444d506217cb6b438fb"}
Apr 16 04:30:06.894256 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:06.894212 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-7bxmc"
Apr 16 04:30:06.917874 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:06.917797 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-7bxmc" podStartSLOduration=2.990029674 podStartE2EDuration="6.917776188s" podCreationTimestamp="2026-04-16 04:30:00 +0000 UTC" firstStartedPulling="2026-04-16 04:30:02.214385663 +0000 UTC m=+358.199835705" lastFinishedPulling="2026-04-16 04:30:06.142132172 +0000 UTC m=+362.127582219" observedRunningTime="2026-04-16 04:30:06.917602649 +0000 UTC m=+362.903052716" watchObservedRunningTime="2026-04-16 04:30:06.917776188 +0000 UTC m=+362.903226254"
Apr 16 04:30:06.944159 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:06.944117 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-bs8tv"]
Apr 16 04:30:06.947614 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:30:06.947582 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc21c3371_b668_4261_a94f_0d5af069b04c.slice/crio-6268ed94b3ba2e7f6abfc63885bbb221a85143913856f0c4b5f79a242301520e WatchSource:0}: Error finding container 6268ed94b3ba2e7f6abfc63885bbb221a85143913856f0c4b5f79a242301520e: Status 404 returned error can't find the container with id 6268ed94b3ba2e7f6abfc63885bbb221a85143913856f0c4b5f79a242301520e
Apr 16 04:30:07.899001 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:07.898964 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-bs8tv" event={"ID":"c21c3371-b668-4261-a94f-0d5af069b04c","Type":"ContainerStarted","Data":"6268ed94b3ba2e7f6abfc63885bbb221a85143913856f0c4b5f79a242301520e"}
Apr 16 04:30:09.908246 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:09.908203 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-bs8tv" event={"ID":"c21c3371-b668-4261-a94f-0d5af069b04c","Type":"ContainerStarted","Data":"29658f27b0c0f455893a0df10c4e769cc889250cb110f6c9883b05658812f5b7"}
Apr 16 04:30:09.908651 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:09.908317 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-bs8tv"
Apr 16 04:30:09.930134 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:09.930084 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-bs8tv" podStartSLOduration=2.498682948 podStartE2EDuration="4.930069245s" podCreationTimestamp="2026-04-16 04:30:05 +0000 UTC" firstStartedPulling="2026-04-16 04:30:06.949096786 +0000 UTC m=+362.934546827" lastFinishedPulling="2026-04-16 04:30:09.380483074 +0000 UTC m=+365.365933124" observedRunningTime="2026-04-16 04:30:09.926441659 +0000 UTC m=+365.911891724" watchObservedRunningTime="2026-04-16 04:30:09.930069245 +0000 UTC m=+365.915519310"
Apr 16 04:30:11.502903 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:11.502863 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-mzjh4"]
Apr 16 04:30:11.506382 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:11.506365 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-mzjh4"
Apr 16 04:30:11.509571 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:11.509548 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 16 04:30:11.509696 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:11.509546 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 16 04:30:11.509696 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:11.509586 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-2wqlq\""
Apr 16 04:30:11.516212 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:11.516189 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-mzjh4"]
Apr 16 04:30:11.601448 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:11.601407 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/633efbb9-453d-4660-9aa1-cf2e412f3b41-operator-config\") pod \"servicemesh-operator3-55f49c5f94-mzjh4\" (UID: \"633efbb9-453d-4660-9aa1-cf2e412f3b41\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-mzjh4"
Apr 16 04:30:11.601623 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:11.601473 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmszm\" (UniqueName: \"kubernetes.io/projected/633efbb9-453d-4660-9aa1-cf2e412f3b41-kube-api-access-cmszm\") pod \"servicemesh-operator3-55f49c5f94-mzjh4\" (UID: \"633efbb9-453d-4660-9aa1-cf2e412f3b41\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-mzjh4"
Apr 16 04:30:11.702475 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:11.702431 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmszm\" (UniqueName: \"kubernetes.io/projected/633efbb9-453d-4660-9aa1-cf2e412f3b41-kube-api-access-cmszm\") pod \"servicemesh-operator3-55f49c5f94-mzjh4\" (UID: \"633efbb9-453d-4660-9aa1-cf2e412f3b41\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-mzjh4"
Apr 16 04:30:11.702678 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:11.702512 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/633efbb9-453d-4660-9aa1-cf2e412f3b41-operator-config\") pod \"servicemesh-operator3-55f49c5f94-mzjh4\" (UID: \"633efbb9-453d-4660-9aa1-cf2e412f3b41\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-mzjh4"
Apr 16 04:30:11.705115 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:11.705092 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/633efbb9-453d-4660-9aa1-cf2e412f3b41-operator-config\") pod \"servicemesh-operator3-55f49c5f94-mzjh4\" (UID: \"633efbb9-453d-4660-9aa1-cf2e412f3b41\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-mzjh4"
Apr 16 04:30:11.710643 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:11.710621 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmszm\" (UniqueName: \"kubernetes.io/projected/633efbb9-453d-4660-9aa1-cf2e412f3b41-kube-api-access-cmszm\") pod \"servicemesh-operator3-55f49c5f94-mzjh4\" (UID: \"633efbb9-453d-4660-9aa1-cf2e412f3b41\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-mzjh4"
Apr 16 04:30:11.816281 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:11.816192 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-mzjh4"
Apr 16 04:30:11.963137 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:11.963108 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-mzjh4"]
Apr 16 04:30:11.965665 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:30:11.965632 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod633efbb9_453d_4660_9aa1_cf2e412f3b41.slice/crio-20e60b351127979cc85845dcd64dcf625aa09436dbeaed81d874c7562b687415 WatchSource:0}: Error finding container 20e60b351127979cc85845dcd64dcf625aa09436dbeaed81d874c7562b687415: Status 404 returned error can't find the container with id 20e60b351127979cc85845dcd64dcf625aa09436dbeaed81d874c7562b687415
Apr 16 04:30:12.922276 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:12.922241 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-mzjh4" event={"ID":"633efbb9-453d-4660-9aa1-cf2e412f3b41","Type":"ContainerStarted","Data":"20e60b351127979cc85845dcd64dcf625aa09436dbeaed81d874c7562b687415"}
Apr 16 04:30:14.932247 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:14.932155 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-mzjh4" event={"ID":"633efbb9-453d-4660-9aa1-cf2e412f3b41","Type":"ContainerStarted","Data":"08f99979ea11520fc81e84603aee3a413552cdbfd9048121ea2908a7ebfd7747"}
Apr 16 04:30:14.932786 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:14.932274 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-mzjh4"
Apr 16 04:30:14.950182 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:14.950126 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-mzjh4" podStartSLOduration=1.353303178 podStartE2EDuration="3.950064181s" podCreationTimestamp="2026-04-16 04:30:11 +0000 UTC" firstStartedPulling="2026-04-16 04:30:11.968345885 +0000 UTC m=+367.953795928" lastFinishedPulling="2026-04-16 04:30:14.565106883 +0000 UTC m=+370.550556931" observedRunningTime="2026-04-16 04:30:14.949267193 +0000 UTC m=+370.934717257" watchObservedRunningTime="2026-04-16 04:30:14.950064181 +0000 UTC m=+370.935514245"
Apr 16 04:30:17.901561 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:17.901526 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-7bxmc"
Apr 16 04:30:25.938004 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:25.937972 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-mzjh4"
Apr 16 04:30:26.845650 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:26.845606 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz"]
Apr 16 04:30:26.849385 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:26.849361 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz"
Apr 16 04:30:26.853320 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:26.853288 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 16 04:30:26.853450 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:26.853337 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 16 04:30:26.853450 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:26.853366 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 16 04:30:26.853635 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:26.853619 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-cp8rd\""
Apr 16 04:30:26.854621 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:26.854595 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 16 04:30:26.876566 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:26.876537 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz"]
Apr 16 04:30:26.942088 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:26.942057 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/aae2b43c-b3a7-419a-8da3-5fd4e1646843-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-fzjjz\" (UID: \"aae2b43c-b3a7-419a-8da3-5fd4e1646843\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz"
Apr 16 04:30:26.942454 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:26.942096 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz58r\" (UniqueName: \"kubernetes.io/projected/aae2b43c-b3a7-419a-8da3-5fd4e1646843-kube-api-access-tz58r\") pod \"istiod-openshift-gateway-55ff986f96-fzjjz\" (UID: \"aae2b43c-b3a7-419a-8da3-5fd4e1646843\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz"
Apr 16 04:30:26.942454 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:26.942124 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/aae2b43c-b3a7-419a-8da3-5fd4e1646843-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-fzjjz\" (UID: \"aae2b43c-b3a7-419a-8da3-5fd4e1646843\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz"
Apr 16 04:30:26.942454 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:26.942196 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aae2b43c-b3a7-419a-8da3-5fd4e1646843-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-fzjjz\" (UID: \"aae2b43c-b3a7-419a-8da3-5fd4e1646843\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz"
Apr 16 04:30:26.942454 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:26.942270 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/aae2b43c-b3a7-419a-8da3-5fd4e1646843-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-fzjjz\" (UID: \"aae2b43c-b3a7-419a-8da3-5fd4e1646843\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz"
Apr 16 04:30:26.942454 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:26.942331 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/aae2b43c-b3a7-419a-8da3-5fd4e1646843-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-fzjjz\" (UID: \"aae2b43c-b3a7-419a-8da3-5fd4e1646843\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz"
Apr 16 04:30:26.942454 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:26.942369 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/aae2b43c-b3a7-419a-8da3-5fd4e1646843-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-fzjjz\" (UID: \"aae2b43c-b3a7-419a-8da3-5fd4e1646843\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz"
Apr 16 04:30:27.042807 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:27.042767 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/aae2b43c-b3a7-419a-8da3-5fd4e1646843-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-fzjjz\" (UID: \"aae2b43c-b3a7-419a-8da3-5fd4e1646843\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz"
Apr 16 04:30:27.042807 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:27.042813 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/aae2b43c-b3a7-419a-8da3-5fd4e1646843-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-fzjjz\" (UID: \"aae2b43c-b3a7-419a-8da3-5fd4e1646843\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz"
Apr 16 04:30:27.043090 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:27.042858 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tz58r\" (UniqueName: \"kubernetes.io/projected/aae2b43c-b3a7-419a-8da3-5fd4e1646843-kube-api-access-tz58r\") pod \"istiod-openshift-gateway-55ff986f96-fzjjz\" (UID: \"aae2b43c-b3a7-419a-8da3-5fd4e1646843\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz"
Apr 16 04:30:27.043090 ip-10-0-133-81 kubenswrapper[2567]:
I0416 04:30:27.042885 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/aae2b43c-b3a7-419a-8da3-5fd4e1646843-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-fzjjz\" (UID: \"aae2b43c-b3a7-419a-8da3-5fd4e1646843\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz" Apr 16 04:30:27.043090 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:27.042920 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aae2b43c-b3a7-419a-8da3-5fd4e1646843-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-fzjjz\" (UID: \"aae2b43c-b3a7-419a-8da3-5fd4e1646843\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz" Apr 16 04:30:27.043090 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:27.042993 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/aae2b43c-b3a7-419a-8da3-5fd4e1646843-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-fzjjz\" (UID: \"aae2b43c-b3a7-419a-8da3-5fd4e1646843\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz" Apr 16 04:30:27.043327 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:27.043297 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/aae2b43c-b3a7-419a-8da3-5fd4e1646843-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-fzjjz\" (UID: \"aae2b43c-b3a7-419a-8da3-5fd4e1646843\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz" Apr 16 04:30:27.043605 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:27.043580 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/aae2b43c-b3a7-419a-8da3-5fd4e1646843-istio-csr-ca-configmap\") pod 
\"istiod-openshift-gateway-55ff986f96-fzjjz\" (UID: \"aae2b43c-b3a7-419a-8da3-5fd4e1646843\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz" Apr 16 04:30:27.045595 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:27.045563 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/aae2b43c-b3a7-419a-8da3-5fd4e1646843-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-fzjjz\" (UID: \"aae2b43c-b3a7-419a-8da3-5fd4e1646843\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz" Apr 16 04:30:27.045699 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:27.045563 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/aae2b43c-b3a7-419a-8da3-5fd4e1646843-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-fzjjz\" (UID: \"aae2b43c-b3a7-419a-8da3-5fd4e1646843\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz" Apr 16 04:30:27.045699 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:27.045641 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/aae2b43c-b3a7-419a-8da3-5fd4e1646843-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-fzjjz\" (UID: \"aae2b43c-b3a7-419a-8da3-5fd4e1646843\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz" Apr 16 04:30:27.045808 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:27.045730 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/aae2b43c-b3a7-419a-8da3-5fd4e1646843-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-fzjjz\" (UID: \"aae2b43c-b3a7-419a-8da3-5fd4e1646843\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz" Apr 16 04:30:27.058401 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:27.058376 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tz58r\" (UniqueName: \"kubernetes.io/projected/aae2b43c-b3a7-419a-8da3-5fd4e1646843-kube-api-access-tz58r\") pod \"istiod-openshift-gateway-55ff986f96-fzjjz\" (UID: \"aae2b43c-b3a7-419a-8da3-5fd4e1646843\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz" Apr 16 04:30:27.061747 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:27.061724 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aae2b43c-b3a7-419a-8da3-5fd4e1646843-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-fzjjz\" (UID: \"aae2b43c-b3a7-419a-8da3-5fd4e1646843\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz" Apr 16 04:30:27.159688 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:27.159600 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz" Apr 16 04:30:27.298190 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:27.298169 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz"] Apr 16 04:30:27.300420 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:30:27.300385 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaae2b43c_b3a7_419a_8da3_5fd4e1646843.slice/crio-e9db012d973b1334d7e6f2d473a9b26a911d879f2e0b6d845073c5a1ec1ac0bb WatchSource:0}: Error finding container e9db012d973b1334d7e6f2d473a9b26a911d879f2e0b6d845073c5a1ec1ac0bb: Status 404 returned error can't find the container with id e9db012d973b1334d7e6f2d473a9b26a911d879f2e0b6d845073c5a1ec1ac0bb Apr 16 04:30:27.978045 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:27.978015 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz" 
event={"ID":"aae2b43c-b3a7-419a-8da3-5fd4e1646843","Type":"ContainerStarted","Data":"e9db012d973b1334d7e6f2d473a9b26a911d879f2e0b6d845073c5a1ec1ac0bb"} Apr 16 04:30:29.938875 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:29.938814 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 04:30:29.939149 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:29.938903 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 04:30:30.991165 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:30.991115 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz" event={"ID":"aae2b43c-b3a7-419a-8da3-5fd4e1646843","Type":"ContainerStarted","Data":"dceda1a961000167504da3308dc4260c0136224e72081fd10b82eb75be900e04"} Apr 16 04:30:30.991676 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:30.991241 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz" Apr 16 04:30:31.028530 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:31.028479 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz" podStartSLOduration=2.392311958 podStartE2EDuration="5.028460665s" podCreationTimestamp="2026-04-16 04:30:26 +0000 UTC" firstStartedPulling="2026-04-16 04:30:27.302385761 +0000 UTC m=+383.287835807" lastFinishedPulling="2026-04-16 04:30:29.938534471 +0000 UTC m=+385.923984514" observedRunningTime="2026-04-16 04:30:31.026605967 +0000 UTC m=+387.012056034" watchObservedRunningTime="2026-04-16 04:30:31.028460665 +0000 UTC m=+387.013910729" Apr 16 04:30:31.996421 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:31.996393 
2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fzjjz" Apr 16 04:30:40.918734 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:30:40.918703 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-bs8tv" Apr 16 04:31:36.149650 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:36.149568 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-4cwbl"] Apr 16 04:31:36.152087 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:36.152070 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-4cwbl" Apr 16 04:31:36.155137 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:36.155117 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-4r9q6\"" Apr 16 04:31:36.155262 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:36.155117 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 04:31:36.156434 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:36.156418 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 04:31:36.163505 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:36.163481 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-4cwbl"] Apr 16 04:31:36.166018 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:36.165996 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctz8c\" (UniqueName: \"kubernetes.io/projected/63284ba8-4e60-4831-a141-b64515390a20-kube-api-access-ctz8c\") pod \"authorino-operator-657f44b778-4cwbl\" (UID: \"63284ba8-4e60-4831-a141-b64515390a20\") " 
pod="kuadrant-system/authorino-operator-657f44b778-4cwbl" Apr 16 04:31:36.266789 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:36.266756 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctz8c\" (UniqueName: \"kubernetes.io/projected/63284ba8-4e60-4831-a141-b64515390a20-kube-api-access-ctz8c\") pod \"authorino-operator-657f44b778-4cwbl\" (UID: \"63284ba8-4e60-4831-a141-b64515390a20\") " pod="kuadrant-system/authorino-operator-657f44b778-4cwbl" Apr 16 04:31:36.277549 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:36.277522 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctz8c\" (UniqueName: \"kubernetes.io/projected/63284ba8-4e60-4831-a141-b64515390a20-kube-api-access-ctz8c\") pod \"authorino-operator-657f44b778-4cwbl\" (UID: \"63284ba8-4e60-4831-a141-b64515390a20\") " pod="kuadrant-system/authorino-operator-657f44b778-4cwbl" Apr 16 04:31:36.462636 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:36.462549 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-4cwbl" Apr 16 04:31:36.594296 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:36.594267 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-4cwbl"] Apr 16 04:31:36.596717 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:31:36.596687 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63284ba8_4e60_4831_a141_b64515390a20.slice/crio-3992dc626612e09d5b965d7a6791af9e063d5c2cc031b67d09106cbbe135c1ec WatchSource:0}: Error finding container 3992dc626612e09d5b965d7a6791af9e063d5c2cc031b67d09106cbbe135c1ec: Status 404 returned error can't find the container with id 3992dc626612e09d5b965d7a6791af9e063d5c2cc031b67d09106cbbe135c1ec Apr 16 04:31:37.221530 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:37.221493 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-4cwbl" event={"ID":"63284ba8-4e60-4831-a141-b64515390a20","Type":"ContainerStarted","Data":"3992dc626612e09d5b965d7a6791af9e063d5c2cc031b67d09106cbbe135c1ec"} Apr 16 04:31:39.236639 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:39.236598 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-4cwbl" event={"ID":"63284ba8-4e60-4831-a141-b64515390a20","Type":"ContainerStarted","Data":"5fc6534184b4936a9c79f732b5a2e1ebdb7354292a819e1e5942fb7b2c7f744d"} Apr 16 04:31:39.237025 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:39.236719 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-4cwbl" Apr 16 04:31:39.258526 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:39.258464 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-4cwbl" podStartSLOduration=1.562097735 
podStartE2EDuration="3.258447292s" podCreationTimestamp="2026-04-16 04:31:36 +0000 UTC" firstStartedPulling="2026-04-16 04:31:36.599267567 +0000 UTC m=+452.584717613" lastFinishedPulling="2026-04-16 04:31:38.295617124 +0000 UTC m=+454.281067170" observedRunningTime="2026-04-16 04:31:39.255667901 +0000 UTC m=+455.241117966" watchObservedRunningTime="2026-04-16 04:31:39.258447292 +0000 UTC m=+455.243897394" Apr 16 04:31:40.547896 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:40.547857 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq"] Apr 16 04:31:40.550659 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:40.550635 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq" Apr 16 04:31:40.553257 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:40.553224 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-ppqd7\"" Apr 16 04:31:40.563138 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:40.563110 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq"] Apr 16 04:31:40.605092 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:40.605056 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2b15923a-e23f-4198-8848-6892bbb9db6c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vqwtq\" (UID: \"2b15923a-e23f-4198-8848-6892bbb9db6c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq" Apr 16 04:31:40.605278 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:40.605107 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qszks\" (UniqueName: \"kubernetes.io/projected/2b15923a-e23f-4198-8848-6892bbb9db6c-kube-api-access-qszks\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vqwtq\" (UID: \"2b15923a-e23f-4198-8848-6892bbb9db6c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq" Apr 16 04:31:40.705912 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:40.705871 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2b15923a-e23f-4198-8848-6892bbb9db6c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vqwtq\" (UID: \"2b15923a-e23f-4198-8848-6892bbb9db6c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq" Apr 16 04:31:40.706067 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:40.705926 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qszks\" (UniqueName: \"kubernetes.io/projected/2b15923a-e23f-4198-8848-6892bbb9db6c-kube-api-access-qszks\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vqwtq\" (UID: \"2b15923a-e23f-4198-8848-6892bbb9db6c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq" Apr 16 04:31:40.706345 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:40.706324 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2b15923a-e23f-4198-8848-6892bbb9db6c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vqwtq\" (UID: \"2b15923a-e23f-4198-8848-6892bbb9db6c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq" Apr 16 04:31:40.720597 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:40.720575 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qszks\" (UniqueName: 
\"kubernetes.io/projected/2b15923a-e23f-4198-8848-6892bbb9db6c-kube-api-access-qszks\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vqwtq\" (UID: \"2b15923a-e23f-4198-8848-6892bbb9db6c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq" Apr 16 04:31:40.861699 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:40.861616 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq" Apr 16 04:31:41.003672 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:41.003651 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq"] Apr 16 04:31:41.006191 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:31:41.006161 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b15923a_e23f_4198_8848_6892bbb9db6c.slice/crio-de19e7ba5707de1898f08deaf07b9b5811511a47569da67db4dbcec9662ce798 WatchSource:0}: Error finding container de19e7ba5707de1898f08deaf07b9b5811511a47569da67db4dbcec9662ce798: Status 404 returned error can't find the container with id de19e7ba5707de1898f08deaf07b9b5811511a47569da67db4dbcec9662ce798 Apr 16 04:31:41.244816 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:41.244780 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq" event={"ID":"2b15923a-e23f-4198-8848-6892bbb9db6c","Type":"ContainerStarted","Data":"de19e7ba5707de1898f08deaf07b9b5811511a47569da67db4dbcec9662ce798"} Apr 16 04:31:47.270409 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:47.270363 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq" 
event={"ID":"2b15923a-e23f-4198-8848-6892bbb9db6c","Type":"ContainerStarted","Data":"1a0808df25b1014e157288e6c89e5c4b448c426a006a3a56d6524a96bca49f85"} Apr 16 04:31:47.270883 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:47.270488 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq" Apr 16 04:31:47.291095 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:47.291040 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq" podStartSLOduration=1.361545727 podStartE2EDuration="7.29102505s" podCreationTimestamp="2026-04-16 04:31:40 +0000 UTC" firstStartedPulling="2026-04-16 04:31:41.008555849 +0000 UTC m=+456.994005890" lastFinishedPulling="2026-04-16 04:31:46.938035171 +0000 UTC m=+462.923485213" observedRunningTime="2026-04-16 04:31:47.288366113 +0000 UTC m=+463.273816179" watchObservedRunningTime="2026-04-16 04:31:47.29102505 +0000 UTC m=+463.276475114" Apr 16 04:31:50.242096 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:50.242062 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-4cwbl" Apr 16 04:31:58.276661 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:58.276629 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq" Apr 16 04:31:59.208517 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.208478 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq"] Apr 16 04:31:59.208732 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.208704 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq" podUID="2b15923a-e23f-4198-8848-6892bbb9db6c" 
containerName="manager" containerID="cri-o://1a0808df25b1014e157288e6c89e5c4b448c426a006a3a56d6524a96bca49f85" gracePeriod=2 Apr 16 04:31:59.221500 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.221471 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq"] Apr 16 04:31:59.250254 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.250220 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz"] Apr 16 04:31:59.250639 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.250623 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b15923a-e23f-4198-8848-6892bbb9db6c" containerName="manager" Apr 16 04:31:59.250639 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.250640 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b15923a-e23f-4198-8848-6892bbb9db6c" containerName="manager" Apr 16 04:31:59.250792 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.250738 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b15923a-e23f-4198-8848-6892bbb9db6c" containerName="manager" Apr 16 04:31:59.253030 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.252657 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz" Apr 16 04:31:59.255042 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.255008 2567 status_manager.go:895] "Failed to get status for pod" podUID="2b15923a-e23f-4198-8848-6892bbb9db6c" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vqwtq\" is forbidden: User \"system:node:ip-10-0-133-81.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-81.ec2.internal' and this object" Apr 16 04:31:59.267578 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.267547 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz"] Apr 16 04:31:59.375424 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.375392 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lsj8\" (UniqueName: \"kubernetes.io/projected/a33cf339-f9bf-4b3e-b069-47ba2029bd8d-kube-api-access-8lsj8\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-swpxz\" (UID: \"a33cf339-f9bf-4b3e-b069-47ba2029bd8d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz" Apr 16 04:31:59.375752 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.375463 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a33cf339-f9bf-4b3e-b069-47ba2029bd8d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-swpxz\" (UID: \"a33cf339-f9bf-4b3e-b069-47ba2029bd8d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz" Apr 16 04:31:59.435447 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.435424 2567 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq" Apr 16 04:31:59.437837 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.437804 2567 status_manager.go:895] "Failed to get status for pod" podUID="2b15923a-e23f-4198-8848-6892bbb9db6c" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vqwtq\" is forbidden: User \"system:node:ip-10-0-133-81.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-81.ec2.internal' and this object" Apr 16 04:31:59.476487 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.476393 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a33cf339-f9bf-4b3e-b069-47ba2029bd8d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-swpxz\" (UID: \"a33cf339-f9bf-4b3e-b069-47ba2029bd8d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz" Apr 16 04:31:59.476621 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.476522 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8lsj8\" (UniqueName: \"kubernetes.io/projected/a33cf339-f9bf-4b3e-b069-47ba2029bd8d-kube-api-access-8lsj8\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-swpxz\" (UID: \"a33cf339-f9bf-4b3e-b069-47ba2029bd8d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz" Apr 16 04:31:59.476797 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.476772 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a33cf339-f9bf-4b3e-b069-47ba2029bd8d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-swpxz\" (UID: 
\"a33cf339-f9bf-4b3e-b069-47ba2029bd8d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz" Apr 16 04:31:59.486488 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.486460 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lsj8\" (UniqueName: \"kubernetes.io/projected/a33cf339-f9bf-4b3e-b069-47ba2029bd8d-kube-api-access-8lsj8\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-swpxz\" (UID: \"a33cf339-f9bf-4b3e-b069-47ba2029bd8d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz" Apr 16 04:31:59.577443 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.577404 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2b15923a-e23f-4198-8848-6892bbb9db6c-extensions-socket-volume\") pod \"2b15923a-e23f-4198-8848-6892bbb9db6c\" (UID: \"2b15923a-e23f-4198-8848-6892bbb9db6c\") " Apr 16 04:31:59.577443 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.577448 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qszks\" (UniqueName: \"kubernetes.io/projected/2b15923a-e23f-4198-8848-6892bbb9db6c-kube-api-access-qszks\") pod \"2b15923a-e23f-4198-8848-6892bbb9db6c\" (UID: \"2b15923a-e23f-4198-8848-6892bbb9db6c\") " Apr 16 04:31:59.578018 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.577994 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b15923a-e23f-4198-8848-6892bbb9db6c-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "2b15923a-e23f-4198-8848-6892bbb9db6c" (UID: "2b15923a-e23f-4198-8848-6892bbb9db6c"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 04:31:59.579763 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.579736 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b15923a-e23f-4198-8848-6892bbb9db6c-kube-api-access-qszks" (OuterVolumeSpecName: "kube-api-access-qszks") pod "2b15923a-e23f-4198-8848-6892bbb9db6c" (UID: "2b15923a-e23f-4198-8848-6892bbb9db6c"). InnerVolumeSpecName "kube-api-access-qszks". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 04:31:59.587923 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.587898 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz"
Apr 16 04:31:59.678604 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.678572 2567 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2b15923a-e23f-4198-8848-6892bbb9db6c-extensions-socket-volume\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\""
Apr 16 04:31:59.678604 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.678601 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qszks\" (UniqueName: \"kubernetes.io/projected/2b15923a-e23f-4198-8848-6892bbb9db6c-kube-api-access-qszks\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\""
Apr 16 04:31:59.719160 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:31:59.719131 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz"]
Apr 16 04:31:59.721631 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:31:59.721603 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda33cf339_f9bf_4b3e_b069_47ba2029bd8d.slice/crio-727f9552884054436b1dc12b008d1fc7e30fe2549727076f57339c82f16fa362 WatchSource:0}: Error finding container 727f9552884054436b1dc12b008d1fc7e30fe2549727076f57339c82f16fa362: Status 404 returned error can't find the container with id 727f9552884054436b1dc12b008d1fc7e30fe2549727076f57339c82f16fa362
Apr 16 04:32:00.321707 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:00.321674 2567 generic.go:358] "Generic (PLEG): container finished" podID="2b15923a-e23f-4198-8848-6892bbb9db6c" containerID="1a0808df25b1014e157288e6c89e5c4b448c426a006a3a56d6524a96bca49f85" exitCode=0
Apr 16 04:32:00.321921 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:00.321727 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq"
Apr 16 04:32:00.321921 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:00.321768 2567 scope.go:117] "RemoveContainer" containerID="1a0808df25b1014e157288e6c89e5c4b448c426a006a3a56d6524a96bca49f85"
Apr 16 04:32:00.323369 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:00.323345 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz" event={"ID":"a33cf339-f9bf-4b3e-b069-47ba2029bd8d","Type":"ContainerStarted","Data":"8d953617caa0d3a589bd013f7906f6668ace49c8ec6adf5cf3c90fcaeb4fd7dc"}
Apr 16 04:32:00.323495 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:00.323379 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz" event={"ID":"a33cf339-f9bf-4b3e-b069-47ba2029bd8d","Type":"ContainerStarted","Data":"727f9552884054436b1dc12b008d1fc7e30fe2549727076f57339c82f16fa362"}
Apr 16 04:32:00.323495 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:00.323454 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz"
Apr 16 04:32:00.324617 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:00.324591 2567 status_manager.go:895] "Failed to get status for pod" podUID="2b15923a-e23f-4198-8848-6892bbb9db6c" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vqwtq\" is forbidden: User \"system:node:ip-10-0-133-81.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-81.ec2.internal' and this object"
Apr 16 04:32:00.327023 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:00.326997 2567 status_manager.go:895] "Failed to get status for pod" podUID="2b15923a-e23f-4198-8848-6892bbb9db6c" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vqwtq\" is forbidden: User \"system:node:ip-10-0-133-81.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-81.ec2.internal' and this object"
Apr 16 04:32:00.338184 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:00.338161 2567 scope.go:117] "RemoveContainer" containerID="1a0808df25b1014e157288e6c89e5c4b448c426a006a3a56d6524a96bca49f85"
Apr 16 04:32:00.338532 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:32:00.338511 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a0808df25b1014e157288e6c89e5c4b448c426a006a3a56d6524a96bca49f85\": container with ID starting with 1a0808df25b1014e157288e6c89e5c4b448c426a006a3a56d6524a96bca49f85 not found: ID does not exist" containerID="1a0808df25b1014e157288e6c89e5c4b448c426a006a3a56d6524a96bca49f85"
Apr 16 04:32:00.338608 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:00.338542 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a0808df25b1014e157288e6c89e5c4b448c426a006a3a56d6524a96bca49f85"} err="failed to get container status \"1a0808df25b1014e157288e6c89e5c4b448c426a006a3a56d6524a96bca49f85\": rpc error: code = NotFound desc = could not find container \"1a0808df25b1014e157288e6c89e5c4b448c426a006a3a56d6524a96bca49f85\": container with ID starting with 1a0808df25b1014e157288e6c89e5c4b448c426a006a3a56d6524a96bca49f85 not found: ID does not exist"
Apr 16 04:32:00.366368 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:00.366315 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz" podStartSLOduration=1.3663024670000001 podStartE2EDuration="1.366302467s" podCreationTimestamp="2026-04-16 04:31:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 04:32:00.3653571 +0000 UTC m=+476.350807166" watchObservedRunningTime="2026-04-16 04:32:00.366302467 +0000 UTC m=+476.351752531"
Apr 16 04:32:00.367905 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:00.367859 2567 status_manager.go:895] "Failed to get status for pod" podUID="2b15923a-e23f-4198-8848-6892bbb9db6c" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vqwtq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vqwtq\" is forbidden: User \"system:node:ip-10-0-133-81.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-81.ec2.internal' and this object"
Apr 16 04:32:00.674669 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:00.674594 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b15923a-e23f-4198-8848-6892bbb9db6c" path="/var/lib/kubelet/pods/2b15923a-e23f-4198-8848-6892bbb9db6c/volumes"
Apr 16 04:32:11.330423 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:11.330390 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz"
Apr 16 04:32:15.472394 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:15.472340 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz"]
Apr 16 04:32:15.472945 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:15.472640 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz" podUID="a33cf339-f9bf-4b3e-b069-47ba2029bd8d" containerName="manager" containerID="cri-o://8d953617caa0d3a589bd013f7906f6668ace49c8ec6adf5cf3c90fcaeb4fd7dc" gracePeriod=10
Apr 16 04:32:15.727146 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:15.727070 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz"
Apr 16 04:32:15.832127 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:15.832085 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lsj8\" (UniqueName: \"kubernetes.io/projected/a33cf339-f9bf-4b3e-b069-47ba2029bd8d-kube-api-access-8lsj8\") pod \"a33cf339-f9bf-4b3e-b069-47ba2029bd8d\" (UID: \"a33cf339-f9bf-4b3e-b069-47ba2029bd8d\") "
Apr 16 04:32:15.832334 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:15.832151 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a33cf339-f9bf-4b3e-b069-47ba2029bd8d-extensions-socket-volume\") pod \"a33cf339-f9bf-4b3e-b069-47ba2029bd8d\" (UID: \"a33cf339-f9bf-4b3e-b069-47ba2029bd8d\") "
Apr 16 04:32:15.832541 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:15.832514 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a33cf339-f9bf-4b3e-b069-47ba2029bd8d-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "a33cf339-f9bf-4b3e-b069-47ba2029bd8d" (UID: "a33cf339-f9bf-4b3e-b069-47ba2029bd8d"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 04:32:15.834214 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:15.834182 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a33cf339-f9bf-4b3e-b069-47ba2029bd8d-kube-api-access-8lsj8" (OuterVolumeSpecName: "kube-api-access-8lsj8") pod "a33cf339-f9bf-4b3e-b069-47ba2029bd8d" (UID: "a33cf339-f9bf-4b3e-b069-47ba2029bd8d"). InnerVolumeSpecName "kube-api-access-8lsj8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 04:32:15.933369 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:15.933325 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8lsj8\" (UniqueName: \"kubernetes.io/projected/a33cf339-f9bf-4b3e-b069-47ba2029bd8d-kube-api-access-8lsj8\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\""
Apr 16 04:32:15.933369 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:15.933362 2567 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a33cf339-f9bf-4b3e-b069-47ba2029bd8d-extensions-socket-volume\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\""
Apr 16 04:32:16.383226 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:16.383186 2567 generic.go:358] "Generic (PLEG): container finished" podID="a33cf339-f9bf-4b3e-b069-47ba2029bd8d" containerID="8d953617caa0d3a589bd013f7906f6668ace49c8ec6adf5cf3c90fcaeb4fd7dc" exitCode=0
Apr 16 04:32:16.383437 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:16.383258 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz"
Apr 16 04:32:16.383437 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:16.383277 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz" event={"ID":"a33cf339-f9bf-4b3e-b069-47ba2029bd8d","Type":"ContainerDied","Data":"8d953617caa0d3a589bd013f7906f6668ace49c8ec6adf5cf3c90fcaeb4fd7dc"}
Apr 16 04:32:16.383437 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:16.383313 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz" event={"ID":"a33cf339-f9bf-4b3e-b069-47ba2029bd8d","Type":"ContainerDied","Data":"727f9552884054436b1dc12b008d1fc7e30fe2549727076f57339c82f16fa362"}
Apr 16 04:32:16.383437 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:16.383328 2567 scope.go:117] "RemoveContainer" containerID="8d953617caa0d3a589bd013f7906f6668ace49c8ec6adf5cf3c90fcaeb4fd7dc"
Apr 16 04:32:16.392822 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:16.392804 2567 scope.go:117] "RemoveContainer" containerID="8d953617caa0d3a589bd013f7906f6668ace49c8ec6adf5cf3c90fcaeb4fd7dc"
Apr 16 04:32:16.393120 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:32:16.393102 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d953617caa0d3a589bd013f7906f6668ace49c8ec6adf5cf3c90fcaeb4fd7dc\": container with ID starting with 8d953617caa0d3a589bd013f7906f6668ace49c8ec6adf5cf3c90fcaeb4fd7dc not found: ID does not exist" containerID="8d953617caa0d3a589bd013f7906f6668ace49c8ec6adf5cf3c90fcaeb4fd7dc"
Apr 16 04:32:16.393168 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:16.393128 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d953617caa0d3a589bd013f7906f6668ace49c8ec6adf5cf3c90fcaeb4fd7dc"} err="failed to get container status \"8d953617caa0d3a589bd013f7906f6668ace49c8ec6adf5cf3c90fcaeb4fd7dc\": rpc error: code = NotFound desc = could not find container \"8d953617caa0d3a589bd013f7906f6668ace49c8ec6adf5cf3c90fcaeb4fd7dc\": container with ID starting with 8d953617caa0d3a589bd013f7906f6668ace49c8ec6adf5cf3c90fcaeb4fd7dc not found: ID does not exist"
Apr 16 04:32:16.422011 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:16.421972 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz"]
Apr 16 04:32:16.430870 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:16.430843 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-swpxz"]
Apr 16 04:32:16.675090 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:16.674997 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a33cf339-f9bf-4b3e-b069-47ba2029bd8d" path="/var/lib/kubelet/pods/a33cf339-f9bf-4b3e-b069-47ba2029bd8d/volumes"
Apr 16 04:32:31.656164 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.656119 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"]
Apr 16 04:32:31.656771 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.656741 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a33cf339-f9bf-4b3e-b069-47ba2029bd8d" containerName="manager"
Apr 16 04:32:31.656771 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.656772 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a33cf339-f9bf-4b3e-b069-47ba2029bd8d" containerName="manager"
Apr 16 04:32:31.656997 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.656978 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="a33cf339-f9bf-4b3e-b069-47ba2029bd8d" containerName="manager"
Apr 16 04:32:31.660613 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.660578 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.663735 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.663700 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-jlr4c\""
Apr 16 04:32:31.673009 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.672974 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"]
Apr 16 04:32:31.774109 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.774071 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/b0f6588a-939a-44db-935c-149b41f7b7ad-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.774109 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.774107 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b0f6588a-939a-44db-935c-149b41f7b7ad-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.774311 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.774137 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/b0f6588a-939a-44db-935c-149b41f7b7ad-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.774311 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.774199 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b0f6588a-939a-44db-935c-149b41f7b7ad-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.774311 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.774238 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpv7v\" (UniqueName: \"kubernetes.io/projected/b0f6588a-939a-44db-935c-149b41f7b7ad-kube-api-access-rpv7v\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.774311 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.774290 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/b0f6588a-939a-44db-935c-149b41f7b7ad-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.774311 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.774309 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/b0f6588a-939a-44db-935c-149b41f7b7ad-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.774463 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.774333 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/b0f6588a-939a-44db-935c-149b41f7b7ad-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.774463 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.774351 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/b0f6588a-939a-44db-935c-149b41f7b7ad-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.875327 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.875280 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/b0f6588a-939a-44db-935c-149b41f7b7ad-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.875501 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.875337 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b0f6588a-939a-44db-935c-149b41f7b7ad-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.875501 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.875383 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/b0f6588a-939a-44db-935c-149b41f7b7ad-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.875501 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.875410 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b0f6588a-939a-44db-935c-149b41f7b7ad-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.875501 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.875439 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpv7v\" (UniqueName: \"kubernetes.io/projected/b0f6588a-939a-44db-935c-149b41f7b7ad-kube-api-access-rpv7v\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.875501 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.875497 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/b0f6588a-939a-44db-935c-149b41f7b7ad-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.875757 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.875525 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/b0f6588a-939a-44db-935c-149b41f7b7ad-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.875757 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.875561 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/b0f6588a-939a-44db-935c-149b41f7b7ad-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.875757 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.875598 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/b0f6588a-939a-44db-935c-149b41f7b7ad-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.875938 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.875806 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/b0f6588a-939a-44db-935c-149b41f7b7ad-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.876028 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.876007 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/b0f6588a-939a-44db-935c-149b41f7b7ad-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.876107 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.876084 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/b0f6588a-939a-44db-935c-149b41f7b7ad-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.876223 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.876202 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/b0f6588a-939a-44db-935c-149b41f7b7ad-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.876495 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.876472 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/b0f6588a-939a-44db-935c-149b41f7b7ad-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.878127 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.878108 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/b0f6588a-939a-44db-935c-149b41f7b7ad-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.878442 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.878424 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b0f6588a-939a-44db-935c-149b41f7b7ad-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.883028 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.883001 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b0f6588a-939a-44db-935c-149b41f7b7ad-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.883347 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.883309 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpv7v\" (UniqueName: \"kubernetes.io/projected/b0f6588a-939a-44db-935c-149b41f7b7ad-kube-api-access-rpv7v\") pod \"maas-default-gateway-openshift-default-58b6f876-tp7ww\" (UID: \"b0f6588a-939a-44db-935c-149b41f7b7ad\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:31.974963 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:31.974865 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:32.119618 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:32.119572 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"]
Apr 16 04:32:32.123266 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:32:32.123221 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0f6588a_939a_44db_935c_149b41f7b7ad.slice/crio-296d1ef4624df505377713941222e0991a928cda7fade6d7f7fa2b529ccbf7cb WatchSource:0}: Error finding container 296d1ef4624df505377713941222e0991a928cda7fade6d7f7fa2b529ccbf7cb: Status 404 returned error can't find the container with id 296d1ef4624df505377713941222e0991a928cda7fade6d7f7fa2b529ccbf7cb
Apr 16 04:32:32.472849 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:32.471339 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww" event={"ID":"b0f6588a-939a-44db-935c-149b41f7b7ad","Type":"ContainerStarted","Data":"296d1ef4624df505377713941222e0991a928cda7fade6d7f7fa2b529ccbf7cb"}
Apr 16 04:32:34.868162 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:34.868125 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 04:32:34.868441 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:34.868197 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 04:32:34.868441 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:34.868223 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 04:32:35.485727 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:35.485688 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww" event={"ID":"b0f6588a-939a-44db-935c-149b41f7b7ad","Type":"ContainerStarted","Data":"f38e409cb5ef95ec6a9775930136a2e694c110e56646e737574c2adb36116845"}
Apr 16 04:32:35.506258 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:35.506207 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww" podStartSLOduration=1.7642517180000001 podStartE2EDuration="4.506190105s" podCreationTimestamp="2026-04-16 04:32:31 +0000 UTC" firstStartedPulling="2026-04-16 04:32:32.125851577 +0000 UTC m=+508.111301633" lastFinishedPulling="2026-04-16 04:32:34.867789964 +0000 UTC m=+510.853240020" observedRunningTime="2026-04-16 04:32:35.505280947 +0000 UTC m=+511.490731011" watchObservedRunningTime="2026-04-16 04:32:35.506190105 +0000 UTC m=+511.491640168"
Apr 16 04:32:35.975086 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:35.975045 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:35.980676 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:35.980647 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:36.216504 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:36.216420 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rmxnz"]
Apr 16 04:32:36.220132 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:36.220106 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-rmxnz"
Apr 16 04:32:36.222772 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:36.222748 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 16 04:32:36.222963 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:36.222814 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-5kqt9\""
Apr 16 04:32:36.228250 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:36.228210 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rmxnz"]
Apr 16 04:32:36.314507 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:36.314472 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rmxnz"]
Apr 16 04:32:36.321283 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:36.321252 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c0bbeb5b-a478-42b0-9710-8970c3f698fb-config-file\") pod \"limitador-limitador-7d549b5b-rmxnz\" (UID: \"c0bbeb5b-a478-42b0-9710-8970c3f698fb\") " pod="kuadrant-system/limitador-limitador-7d549b5b-rmxnz"
Apr 16 04:32:36.321441 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:36.321323 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqw54\" (UniqueName: \"kubernetes.io/projected/c0bbeb5b-a478-42b0-9710-8970c3f698fb-kube-api-access-vqw54\") pod \"limitador-limitador-7d549b5b-rmxnz\" (UID: \"c0bbeb5b-a478-42b0-9710-8970c3f698fb\") " pod="kuadrant-system/limitador-limitador-7d549b5b-rmxnz"
Apr 16 04:32:36.422302 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:36.422261 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqw54\" (UniqueName: \"kubernetes.io/projected/c0bbeb5b-a478-42b0-9710-8970c3f698fb-kube-api-access-vqw54\") pod \"limitador-limitador-7d549b5b-rmxnz\" (UID: \"c0bbeb5b-a478-42b0-9710-8970c3f698fb\") " pod="kuadrant-system/limitador-limitador-7d549b5b-rmxnz"
Apr 16 04:32:36.422470 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:36.422373 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c0bbeb5b-a478-42b0-9710-8970c3f698fb-config-file\") pod \"limitador-limitador-7d549b5b-rmxnz\" (UID: \"c0bbeb5b-a478-42b0-9710-8970c3f698fb\") " pod="kuadrant-system/limitador-limitador-7d549b5b-rmxnz"
Apr 16 04:32:36.423239 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:36.423210 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c0bbeb5b-a478-42b0-9710-8970c3f698fb-config-file\") pod \"limitador-limitador-7d549b5b-rmxnz\" (UID: \"c0bbeb5b-a478-42b0-9710-8970c3f698fb\") " pod="kuadrant-system/limitador-limitador-7d549b5b-rmxnz"
Apr 16 04:32:36.431162 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:36.431126 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqw54\" (UniqueName: \"kubernetes.io/projected/c0bbeb5b-a478-42b0-9710-8970c3f698fb-kube-api-access-vqw54\") pod \"limitador-limitador-7d549b5b-rmxnz\" (UID: \"c0bbeb5b-a478-42b0-9710-8970c3f698fb\") " pod="kuadrant-system/limitador-limitador-7d549b5b-rmxnz"
Apr 16 04:32:36.489499 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:36.489469 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:36.490717 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:36.490693 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-tp7ww"
Apr 16 04:32:36.540489 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:36.540453 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-rmxnz"
Apr 16 04:32:36.682538 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:36.682516 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rmxnz"]
Apr 16 04:32:36.684986 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:32:36.684956 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0bbeb5b_a478_42b0_9710_8970c3f698fb.slice/crio-a209e24b31d149be7e785028a1feb558a7d98524c702ab2710a93d14069a3021 WatchSource:0}: Error finding container a209e24b31d149be7e785028a1feb558a7d98524c702ab2710a93d14069a3021: Status 404 returned error can't find the container with id a209e24b31d149be7e785028a1feb558a7d98524c702ab2710a93d14069a3021
Apr 16 04:32:37.501699 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:37.501628 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-rmxnz" event={"ID":"c0bbeb5b-a478-42b0-9710-8970c3f698fb","Type":"ContainerStarted","Data":"a209e24b31d149be7e785028a1feb558a7d98524c702ab2710a93d14069a3021"}
Apr 16 04:32:39.509650 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:39.509613 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-rmxnz" event={"ID":"c0bbeb5b-a478-42b0-9710-8970c3f698fb","Type":"ContainerStarted","Data":"2e8cd8958750679cda1378cea36c2ea016bcbb9e945c2a727223481727d49f87"}
Apr 16 04:32:39.510227 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:39.509733 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-rmxnz"
Apr 16 04:32:39.525620 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:39.525569 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-rmxnz" podStartSLOduration=0.855943376 podStartE2EDuration="3.525553881s" podCreationTimestamp="2026-04-16 04:32:36 +0000 UTC" firstStartedPulling="2026-04-16 04:32:36.686972578 +0000 UTC m=+512.672422635" lastFinishedPulling="2026-04-16 04:32:39.356583097 +0000 UTC m=+515.342033140" observedRunningTime="2026-04-16 04:32:39.523725092 +0000 UTC m=+515.509175158" watchObservedRunningTime="2026-04-16 04:32:39.525553881 +0000 UTC m=+515.511003995"
Apr 16 04:32:50.514171 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:50.514087 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-rmxnz"
Apr 16 04:32:54.466054 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:54.466014 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rmxnz"]
Apr 16 04:32:54.466517 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:54.466253 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-rmxnz" podUID="c0bbeb5b-a478-42b0-9710-8970c3f698fb" containerName="limitador" containerID="cri-o://2e8cd8958750679cda1378cea36c2ea016bcbb9e945c2a727223481727d49f87" gracePeriod=30
Apr 16 04:32:55.004588 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:55.004561 2567 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-rmxnz" Apr 16 04:32:55.093789 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:55.093689 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqw54\" (UniqueName: \"kubernetes.io/projected/c0bbeb5b-a478-42b0-9710-8970c3f698fb-kube-api-access-vqw54\") pod \"c0bbeb5b-a478-42b0-9710-8970c3f698fb\" (UID: \"c0bbeb5b-a478-42b0-9710-8970c3f698fb\") " Apr 16 04:32:55.093789 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:55.093782 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c0bbeb5b-a478-42b0-9710-8970c3f698fb-config-file\") pod \"c0bbeb5b-a478-42b0-9710-8970c3f698fb\" (UID: \"c0bbeb5b-a478-42b0-9710-8970c3f698fb\") " Apr 16 04:32:55.094186 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:55.094158 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0bbeb5b-a478-42b0-9710-8970c3f698fb-config-file" (OuterVolumeSpecName: "config-file") pod "c0bbeb5b-a478-42b0-9710-8970c3f698fb" (UID: "c0bbeb5b-a478-42b0-9710-8970c3f698fb"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 04:32:55.095914 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:55.095883 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0bbeb5b-a478-42b0-9710-8970c3f698fb-kube-api-access-vqw54" (OuterVolumeSpecName: "kube-api-access-vqw54") pod "c0bbeb5b-a478-42b0-9710-8970c3f698fb" (UID: "c0bbeb5b-a478-42b0-9710-8970c3f698fb"). InnerVolumeSpecName "kube-api-access-vqw54". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 04:32:55.194857 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:55.194800 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vqw54\" (UniqueName: \"kubernetes.io/projected/c0bbeb5b-a478-42b0-9710-8970c3f698fb-kube-api-access-vqw54\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:32:55.194857 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:55.194845 2567 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c0bbeb5b-a478-42b0-9710-8970c3f698fb-config-file\") on node \"ip-10-0-133-81.ec2.internal\" DevicePath \"\"" Apr 16 04:32:55.569915 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:55.569882 2567 generic.go:358] "Generic (PLEG): container finished" podID="c0bbeb5b-a478-42b0-9710-8970c3f698fb" containerID="2e8cd8958750679cda1378cea36c2ea016bcbb9e945c2a727223481727d49f87" exitCode=0 Apr 16 04:32:55.570471 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:55.569964 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-rmxnz" Apr 16 04:32:55.570471 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:55.569974 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-rmxnz" event={"ID":"c0bbeb5b-a478-42b0-9710-8970c3f698fb","Type":"ContainerDied","Data":"2e8cd8958750679cda1378cea36c2ea016bcbb9e945c2a727223481727d49f87"} Apr 16 04:32:55.570471 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:55.570012 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-rmxnz" event={"ID":"c0bbeb5b-a478-42b0-9710-8970c3f698fb","Type":"ContainerDied","Data":"a209e24b31d149be7e785028a1feb558a7d98524c702ab2710a93d14069a3021"} Apr 16 04:32:55.570471 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:55.570028 2567 scope.go:117] "RemoveContainer" containerID="2e8cd8958750679cda1378cea36c2ea016bcbb9e945c2a727223481727d49f87" Apr 16 04:32:55.578720 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:55.578700 2567 scope.go:117] "RemoveContainer" containerID="2e8cd8958750679cda1378cea36c2ea016bcbb9e945c2a727223481727d49f87" Apr 16 04:32:55.579060 ip-10-0-133-81 kubenswrapper[2567]: E0416 04:32:55.579040 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e8cd8958750679cda1378cea36c2ea016bcbb9e945c2a727223481727d49f87\": container with ID starting with 2e8cd8958750679cda1378cea36c2ea016bcbb9e945c2a727223481727d49f87 not found: ID does not exist" containerID="2e8cd8958750679cda1378cea36c2ea016bcbb9e945c2a727223481727d49f87" Apr 16 04:32:55.579134 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:55.579073 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8cd8958750679cda1378cea36c2ea016bcbb9e945c2a727223481727d49f87"} err="failed to get container status \"2e8cd8958750679cda1378cea36c2ea016bcbb9e945c2a727223481727d49f87\": 
rpc error: code = NotFound desc = could not find container \"2e8cd8958750679cda1378cea36c2ea016bcbb9e945c2a727223481727d49f87\": container with ID starting with 2e8cd8958750679cda1378cea36c2ea016bcbb9e945c2a727223481727d49f87 not found: ID does not exist" Apr 16 04:32:55.591336 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:55.591302 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rmxnz"] Apr 16 04:32:55.595174 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:55.595144 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rmxnz"] Apr 16 04:32:56.675288 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:56.675248 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0bbeb5b-a478-42b0-9710-8970c3f698fb" path="/var/lib/kubelet/pods/c0bbeb5b-a478-42b0-9710-8970c3f698fb/volumes" Apr 16 04:32:57.249188 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:57.249156 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-mk8cn"] Apr 16 04:32:57.249595 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:57.249581 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0bbeb5b-a478-42b0-9710-8970c3f698fb" containerName="limitador" Apr 16 04:32:57.249648 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:57.249597 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0bbeb5b-a478-42b0-9710-8970c3f698fb" containerName="limitador" Apr 16 04:32:57.249692 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:57.249682 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0bbeb5b-a478-42b0-9710-8970c3f698fb" containerName="limitador" Apr 16 04:32:57.253941 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:57.253920 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-mk8cn" Apr 16 04:32:57.257001 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:57.256973 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 16 04:32:57.257201 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:57.257002 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-zng7b\"" Apr 16 04:32:57.259231 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:57.259203 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-mk8cn"] Apr 16 04:32:57.415563 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:57.415513 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcvj7\" (UniqueName: \"kubernetes.io/projected/9c5f9d9d-ff46-4279-8750-f7eda58806b6-kube-api-access-vcvj7\") pod \"postgres-868db5846d-mk8cn\" (UID: \"9c5f9d9d-ff46-4279-8750-f7eda58806b6\") " pod="opendatahub/postgres-868db5846d-mk8cn" Apr 16 04:32:57.415772 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:57.415607 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9c5f9d9d-ff46-4279-8750-f7eda58806b6-data\") pod \"postgres-868db5846d-mk8cn\" (UID: \"9c5f9d9d-ff46-4279-8750-f7eda58806b6\") " pod="opendatahub/postgres-868db5846d-mk8cn" Apr 16 04:32:57.516381 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:57.516276 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcvj7\" (UniqueName: \"kubernetes.io/projected/9c5f9d9d-ff46-4279-8750-f7eda58806b6-kube-api-access-vcvj7\") pod \"postgres-868db5846d-mk8cn\" (UID: \"9c5f9d9d-ff46-4279-8750-f7eda58806b6\") " pod="opendatahub/postgres-868db5846d-mk8cn" Apr 16 04:32:57.516381 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:57.516344 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9c5f9d9d-ff46-4279-8750-f7eda58806b6-data\") pod \"postgres-868db5846d-mk8cn\" (UID: \"9c5f9d9d-ff46-4279-8750-f7eda58806b6\") " pod="opendatahub/postgres-868db5846d-mk8cn" Apr 16 04:32:57.516699 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:57.516675 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9c5f9d9d-ff46-4279-8750-f7eda58806b6-data\") pod \"postgres-868db5846d-mk8cn\" (UID: \"9c5f9d9d-ff46-4279-8750-f7eda58806b6\") " pod="opendatahub/postgres-868db5846d-mk8cn" Apr 16 04:32:57.525180 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:57.525151 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcvj7\" (UniqueName: \"kubernetes.io/projected/9c5f9d9d-ff46-4279-8750-f7eda58806b6-kube-api-access-vcvj7\") pod \"postgres-868db5846d-mk8cn\" (UID: \"9c5f9d9d-ff46-4279-8750-f7eda58806b6\") " pod="opendatahub/postgres-868db5846d-mk8cn" Apr 16 04:32:57.568167 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:57.568130 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-mk8cn" Apr 16 04:32:57.700043 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:57.700010 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-mk8cn"] Apr 16 04:32:57.702203 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:32:57.702169 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c5f9d9d_ff46_4279_8750_f7eda58806b6.slice/crio-6ed82f199241812e5b5fae79f3892698478b0dc32c93bd3520fa8241cbdbf852 WatchSource:0}: Error finding container 6ed82f199241812e5b5fae79f3892698478b0dc32c93bd3520fa8241cbdbf852: Status 404 returned error can't find the container with id 6ed82f199241812e5b5fae79f3892698478b0dc32c93bd3520fa8241cbdbf852 Apr 16 04:32:58.587692 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:32:58.587649 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-mk8cn" event={"ID":"9c5f9d9d-ff46-4279-8750-f7eda58806b6","Type":"ContainerStarted","Data":"6ed82f199241812e5b5fae79f3892698478b0dc32c93bd3520fa8241cbdbf852"} Apr 16 04:33:03.616483 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:33:03.616444 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-mk8cn" event={"ID":"9c5f9d9d-ff46-4279-8750-f7eda58806b6","Type":"ContainerStarted","Data":"f844063668a3c9ae7e349ccc4eeb8dcc15c7231bfeb0f4e2398440625d16c820"} Apr 16 04:33:03.616900 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:33:03.616602 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-mk8cn" Apr 16 04:33:03.633171 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:33:03.633108 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-mk8cn" podStartSLOduration=1.656404516 podStartE2EDuration="6.633085011s" podCreationTimestamp="2026-04-16 04:32:57 +0000 UTC" 
firstStartedPulling="2026-04-16 04:32:57.703558987 +0000 UTC m=+533.689009029" lastFinishedPulling="2026-04-16 04:33:02.680239482 +0000 UTC m=+538.665689524" observedRunningTime="2026-04-16 04:33:03.630515046 +0000 UTC m=+539.615965110" watchObservedRunningTime="2026-04-16 04:33:03.633085011 +0000 UTC m=+539.618535076" Apr 16 04:33:09.648332 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:33:09.648300 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-mk8cn" Apr 16 04:33:18.185679 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:33:18.185641 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-bc6ff"] Apr 16 04:33:18.195973 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:33:18.195911 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-bc6ff" Apr 16 04:33:18.198782 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:33:18.198749 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-bc6ff"] Apr 16 04:33:18.199922 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:33:18.199900 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 16 04:33:18.201072 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:33:18.201049 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 16 04:33:18.201189 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:33:18.201110 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"keycloak-operator-dockercfg-vvl2r\"" Apr 16 04:33:18.305717 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:33:18.305682 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7t8k\" (UniqueName: 
\"kubernetes.io/projected/3b67351d-eacf-46de-a238-75a39a8602b5-kube-api-access-g7t8k\") pod \"keycloak-operator-5c4df598dd-bc6ff\" (UID: \"3b67351d-eacf-46de-a238-75a39a8602b5\") " pod="keycloak-system/keycloak-operator-5c4df598dd-bc6ff" Apr 16 04:33:18.406548 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:33:18.406508 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g7t8k\" (UniqueName: \"kubernetes.io/projected/3b67351d-eacf-46de-a238-75a39a8602b5-kube-api-access-g7t8k\") pod \"keycloak-operator-5c4df598dd-bc6ff\" (UID: \"3b67351d-eacf-46de-a238-75a39a8602b5\") " pod="keycloak-system/keycloak-operator-5c4df598dd-bc6ff" Apr 16 04:33:18.415819 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:33:18.415786 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7t8k\" (UniqueName: \"kubernetes.io/projected/3b67351d-eacf-46de-a238-75a39a8602b5-kube-api-access-g7t8k\") pod \"keycloak-operator-5c4df598dd-bc6ff\" (UID: \"3b67351d-eacf-46de-a238-75a39a8602b5\") " pod="keycloak-system/keycloak-operator-5c4df598dd-bc6ff" Apr 16 04:33:18.508085 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:33:18.508050 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-bc6ff" Apr 16 04:33:18.635731 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:33:18.635692 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-bc6ff"] Apr 16 04:33:18.639694 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:33:18.639659 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b67351d_eacf_46de_a238_75a39a8602b5.slice/crio-87ec59092ae388fb480837c7c328ec472ce73e7061088fec89d8bd1659eae268 WatchSource:0}: Error finding container 87ec59092ae388fb480837c7c328ec472ce73e7061088fec89d8bd1659eae268: Status 404 returned error can't find the container with id 87ec59092ae388fb480837c7c328ec472ce73e7061088fec89d8bd1659eae268 Apr 16 04:33:18.674706 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:33:18.674670 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-bc6ff" event={"ID":"3b67351d-eacf-46de-a238-75a39a8602b5","Type":"ContainerStarted","Data":"87ec59092ae388fb480837c7c328ec472ce73e7061088fec89d8bd1659eae268"} Apr 16 04:33:24.695843 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:33:24.695803 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-bc6ff" event={"ID":"3b67351d-eacf-46de-a238-75a39a8602b5","Type":"ContainerStarted","Data":"3974d7c2b7e1b78f5d91ecdf31056b4d1949500f2f047d91b10cd8f5ed30ee82"} Apr 16 04:33:24.712020 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:33:24.711974 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/keycloak-operator-5c4df598dd-bc6ff" podStartSLOduration=1.697014829 podStartE2EDuration="6.711960242s" podCreationTimestamp="2026-04-16 04:33:18 +0000 UTC" firstStartedPulling="2026-04-16 04:33:18.641210692 +0000 UTC m=+554.626660733" lastFinishedPulling="2026-04-16 04:33:23.656156104 +0000 UTC 
m=+559.641606146" observedRunningTime="2026-04-16 04:33:24.710345352 +0000 UTC m=+560.695795417" watchObservedRunningTime="2026-04-16 04:33:24.711960242 +0000 UTC m=+560.697410309" Apr 16 04:34:04.522861 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:34:04.522796 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-chgzn_7cfe1750-0b4e-43ab-b858-92eb84a5bd2a/cluster-monitoring-operator/0.log" Apr 16 04:34:04.524663 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:34:04.524639 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-chgzn_7cfe1750-0b4e-43ab-b858-92eb84a5bd2a/cluster-monitoring-operator/0.log" Apr 16 04:34:04.537033 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:34:04.536993 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/ovn-acl-logging/0.log" Apr 16 04:34:04.539286 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:34:04.539264 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/ovn-acl-logging/0.log" Apr 16 04:35:38.774095 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:38.774056 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7"] Apr 16 04:35:38.777263 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:38.777232 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7" Apr 16 04:35:38.779972 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:38.779940 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 16 04:35:38.780257 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:38.780236 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 16 04:35:38.781289 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:38.781253 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-gmrh6\"" Apr 16 04:35:38.781403 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:38.781389 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 16 04:35:38.786541 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:38.786509 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7"] Apr 16 04:35:38.906743 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:38.906709 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dfebf549-1ed4-4aec-9584-f6343fe655a3-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7\" (UID: \"dfebf549-1ed4-4aec-9584-f6343fe655a3\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7" Apr 16 04:35:38.906743 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:38.906759 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dfebf549-1ed4-4aec-9584-f6343fe655a3-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7\" (UID: \"dfebf549-1ed4-4aec-9584-f6343fe655a3\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7" 
Apr 16 04:35:38.906979 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:38.906815 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dfebf549-1ed4-4aec-9584-f6343fe655a3-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7\" (UID: \"dfebf549-1ed4-4aec-9584-f6343fe655a3\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7" Apr 16 04:35:38.906979 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:38.906889 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dfebf549-1ed4-4aec-9584-f6343fe655a3-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7\" (UID: \"dfebf549-1ed4-4aec-9584-f6343fe655a3\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7" Apr 16 04:35:38.906979 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:38.906932 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dfebf549-1ed4-4aec-9584-f6343fe655a3-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7\" (UID: \"dfebf549-1ed4-4aec-9584-f6343fe655a3\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7" Apr 16 04:35:38.907082 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:38.907006 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzhjj\" (UniqueName: \"kubernetes.io/projected/dfebf549-1ed4-4aec-9584-f6343fe655a3-kube-api-access-qzhjj\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7\" (UID: \"dfebf549-1ed4-4aec-9584-f6343fe655a3\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7" Apr 16 04:35:39.007594 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:39.007555 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dfebf549-1ed4-4aec-9584-f6343fe655a3-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7\" (UID: \"dfebf549-1ed4-4aec-9584-f6343fe655a3\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7" Apr 16 04:35:39.007594 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:39.007606 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzhjj\" (UniqueName: \"kubernetes.io/projected/dfebf549-1ed4-4aec-9584-f6343fe655a3-kube-api-access-qzhjj\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7\" (UID: \"dfebf549-1ed4-4aec-9584-f6343fe655a3\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7" Apr 16 04:35:39.007872 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:39.007650 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dfebf549-1ed4-4aec-9584-f6343fe655a3-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7\" (UID: \"dfebf549-1ed4-4aec-9584-f6343fe655a3\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7" Apr 16 04:35:39.007872 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:39.007680 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dfebf549-1ed4-4aec-9584-f6343fe655a3-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7\" (UID: \"dfebf549-1ed4-4aec-9584-f6343fe655a3\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7" Apr 16 04:35:39.007872 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:39.007699 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dfebf549-1ed4-4aec-9584-f6343fe655a3-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7\" (UID: 
\"dfebf549-1ed4-4aec-9584-f6343fe655a3\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7" Apr 16 04:35:39.007872 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:39.007730 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dfebf549-1ed4-4aec-9584-f6343fe655a3-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7\" (UID: \"dfebf549-1ed4-4aec-9584-f6343fe655a3\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7" Apr 16 04:35:39.008125 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:39.008098 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dfebf549-1ed4-4aec-9584-f6343fe655a3-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7\" (UID: \"dfebf549-1ed4-4aec-9584-f6343fe655a3\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7" Apr 16 04:35:39.008187 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:39.008139 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dfebf549-1ed4-4aec-9584-f6343fe655a3-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7\" (UID: \"dfebf549-1ed4-4aec-9584-f6343fe655a3\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7" Apr 16 04:35:39.008187 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:39.008176 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dfebf549-1ed4-4aec-9584-f6343fe655a3-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7\" (UID: \"dfebf549-1ed4-4aec-9584-f6343fe655a3\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7" Apr 16 04:35:39.010018 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:39.009994 2567 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dfebf549-1ed4-4aec-9584-f6343fe655a3-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7\" (UID: \"dfebf549-1ed4-4aec-9584-f6343fe655a3\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7"
Apr 16 04:35:39.010230 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:39.010211 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dfebf549-1ed4-4aec-9584-f6343fe655a3-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7\" (UID: \"dfebf549-1ed4-4aec-9584-f6343fe655a3\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7"
Apr 16 04:35:39.015071 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:39.015049 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzhjj\" (UniqueName: \"kubernetes.io/projected/dfebf549-1ed4-4aec-9584-f6343fe655a3-kube-api-access-qzhjj\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7\" (UID: \"dfebf549-1ed4-4aec-9584-f6343fe655a3\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7"
Apr 16 04:35:39.091222 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:39.091131 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7"
Apr 16 04:35:39.226057 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:39.226031 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7"]
Apr 16 04:35:39.228464 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:35:39.228433 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfebf549_1ed4_4aec_9584_f6343fe655a3.slice/crio-e3f2611e9219f5341120a9d50348287b254f65ba14fdbe8cce258e6218da038e WatchSource:0}: Error finding container e3f2611e9219f5341120a9d50348287b254f65ba14fdbe8cce258e6218da038e: Status 404 returned error can't find the container with id e3f2611e9219f5341120a9d50348287b254f65ba14fdbe8cce258e6218da038e
Apr 16 04:35:39.230292 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:39.230272 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 04:35:40.207460 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:40.207411 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7" event={"ID":"dfebf549-1ed4-4aec-9584-f6343fe655a3","Type":"ContainerStarted","Data":"e3f2611e9219f5341120a9d50348287b254f65ba14fdbe8cce258e6218da038e"}
Apr 16 04:35:45.231481 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:45.231447 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7" event={"ID":"dfebf549-1ed4-4aec-9584-f6343fe655a3","Type":"ContainerStarted","Data":"60cdce91c47bdfe5617fc90784a253c30b0cabcd06f99621724631ee1bbcb13a"}
Apr 16 04:35:51.259493 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:51.259456 2567 generic.go:358] "Generic (PLEG): container finished" podID="dfebf549-1ed4-4aec-9584-f6343fe655a3" containerID="60cdce91c47bdfe5617fc90784a253c30b0cabcd06f99621724631ee1bbcb13a" exitCode=0
Apr 16 04:35:51.259903 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:51.259529 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7" event={"ID":"dfebf549-1ed4-4aec-9584-f6343fe655a3","Type":"ContainerDied","Data":"60cdce91c47bdfe5617fc90784a253c30b0cabcd06f99621724631ee1bbcb13a"}
Apr 16 04:35:56.285566 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:56.285527 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7" event={"ID":"dfebf549-1ed4-4aec-9584-f6343fe655a3","Type":"ContainerStarted","Data":"71656605d372a24a155a52f6d8f441bc2b7f74d53fd6321d53c5e723c2821563"}
Apr 16 04:35:56.285995 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:56.285762 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7"
Apr 16 04:35:56.304329 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:35:56.304275 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7" podStartSLOduration=2.214199797 podStartE2EDuration="18.304258788s" podCreationTimestamp="2026-04-16 04:35:38 +0000 UTC" firstStartedPulling="2026-04-16 04:35:39.230451842 +0000 UTC m=+695.215901888" lastFinishedPulling="2026-04-16 04:35:55.320510838 +0000 UTC m=+711.305960879" observedRunningTime="2026-04-16 04:35:56.301679867 +0000 UTC m=+712.287129945" watchObservedRunningTime="2026-04-16 04:35:56.304258788 +0000 UTC m=+712.289708852"
Apr 16 04:36:07.303870 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:07.303818 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7"
Apr 16 04:36:13.276795 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.276761 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"]
Apr 16 04:36:13.281032 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.281010 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:36:13.283595 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.283576 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\""
Apr 16 04:36:13.287716 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.287614 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"]
Apr 16 04:36:13.336167 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.336133 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9b3f1f82-90bc-4e8b-a076-191c8a1b902c-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw\" (UID: \"9b3f1f82-90bc-4e8b-a076-191c8a1b902c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:36:13.336336 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.336177 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9b3f1f82-90bc-4e8b-a076-191c8a1b902c-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw\" (UID: \"9b3f1f82-90bc-4e8b-a076-191c8a1b902c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:36:13.336336 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.336233 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9b3f1f82-90bc-4e8b-a076-191c8a1b902c-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw\" (UID: \"9b3f1f82-90bc-4e8b-a076-191c8a1b902c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:36:13.336336 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.336293 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b3f1f82-90bc-4e8b-a076-191c8a1b902c-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw\" (UID: \"9b3f1f82-90bc-4e8b-a076-191c8a1b902c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:36:13.336336 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.336322 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcn4p\" (UniqueName: \"kubernetes.io/projected/9b3f1f82-90bc-4e8b-a076-191c8a1b902c-kube-api-access-qcn4p\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw\" (UID: \"9b3f1f82-90bc-4e8b-a076-191c8a1b902c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:36:13.336522 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.336353 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9b3f1f82-90bc-4e8b-a076-191c8a1b902c-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw\" (UID: \"9b3f1f82-90bc-4e8b-a076-191c8a1b902c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:36:13.437293 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.437245 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b3f1f82-90bc-4e8b-a076-191c8a1b902c-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw\" (UID: \"9b3f1f82-90bc-4e8b-a076-191c8a1b902c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:36:13.437293 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.437304 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qcn4p\" (UniqueName: \"kubernetes.io/projected/9b3f1f82-90bc-4e8b-a076-191c8a1b902c-kube-api-access-qcn4p\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw\" (UID: \"9b3f1f82-90bc-4e8b-a076-191c8a1b902c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:36:13.437549 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.437341 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9b3f1f82-90bc-4e8b-a076-191c8a1b902c-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw\" (UID: \"9b3f1f82-90bc-4e8b-a076-191c8a1b902c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:36:13.437549 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.437468 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9b3f1f82-90bc-4e8b-a076-191c8a1b902c-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw\" (UID: \"9b3f1f82-90bc-4e8b-a076-191c8a1b902c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:36:13.437549 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.437493 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9b3f1f82-90bc-4e8b-a076-191c8a1b902c-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw\" (UID: \"9b3f1f82-90bc-4e8b-a076-191c8a1b902c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:36:13.437549 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.437523 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9b3f1f82-90bc-4e8b-a076-191c8a1b902c-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw\" (UID: \"9b3f1f82-90bc-4e8b-a076-191c8a1b902c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:36:13.437770 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.437706 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b3f1f82-90bc-4e8b-a076-191c8a1b902c-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw\" (UID: \"9b3f1f82-90bc-4e8b-a076-191c8a1b902c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:36:13.437891 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.437861 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9b3f1f82-90bc-4e8b-a076-191c8a1b902c-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw\" (UID: \"9b3f1f82-90bc-4e8b-a076-191c8a1b902c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:36:13.438421 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.438391 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9b3f1f82-90bc-4e8b-a076-191c8a1b902c-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw\" (UID: \"9b3f1f82-90bc-4e8b-a076-191c8a1b902c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:36:13.440416 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.440394 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9b3f1f82-90bc-4e8b-a076-191c8a1b902c-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw\" (UID: \"9b3f1f82-90bc-4e8b-a076-191c8a1b902c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:36:13.440604 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.440584 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9b3f1f82-90bc-4e8b-a076-191c8a1b902c-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw\" (UID: \"9b3f1f82-90bc-4e8b-a076-191c8a1b902c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:36:13.445931 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.445914 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcn4p\" (UniqueName: \"kubernetes.io/projected/9b3f1f82-90bc-4e8b-a076-191c8a1b902c-kube-api-access-qcn4p\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw\" (UID: \"9b3f1f82-90bc-4e8b-a076-191c8a1b902c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:36:13.592410 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.592327 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:36:13.724328 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:13.724213 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"]
Apr 16 04:36:13.727158 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:36:13.727129 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b3f1f82_90bc_4e8b_a076_191c8a1b902c.slice/crio-a84733984f244f11ca310eacf03b60749fe81d455eff7a73e192640fe5d88e65 WatchSource:0}: Error finding container a84733984f244f11ca310eacf03b60749fe81d455eff7a73e192640fe5d88e65: Status 404 returned error can't find the container with id a84733984f244f11ca310eacf03b60749fe81d455eff7a73e192640fe5d88e65
Apr 16 04:36:14.354734 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:14.354695 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw" event={"ID":"9b3f1f82-90bc-4e8b-a076-191c8a1b902c","Type":"ContainerStarted","Data":"5d423f4fe4c6643eb5188c139916268b5503bc5604c6eace0ec6331d602bb03a"}
Apr 16 04:36:14.354734 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:14.354731 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw" event={"ID":"9b3f1f82-90bc-4e8b-a076-191c8a1b902c","Type":"ContainerStarted","Data":"a84733984f244f11ca310eacf03b60749fe81d455eff7a73e192640fe5d88e65"}
Apr 16 04:36:19.375072 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:19.375035 2567 generic.go:358] "Generic (PLEG): container finished" podID="9b3f1f82-90bc-4e8b-a076-191c8a1b902c" containerID="5d423f4fe4c6643eb5188c139916268b5503bc5604c6eace0ec6331d602bb03a" exitCode=0
Apr 16 04:36:19.375455 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:19.375110 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw" event={"ID":"9b3f1f82-90bc-4e8b-a076-191c8a1b902c","Type":"ContainerDied","Data":"5d423f4fe4c6643eb5188c139916268b5503bc5604c6eace0ec6331d602bb03a"}
Apr 16 04:36:20.380987 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:20.380952 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw" event={"ID":"9b3f1f82-90bc-4e8b-a076-191c8a1b902c","Type":"ContainerStarted","Data":"34782bc708612e2a0e1a353e2134355da93f219073b95363d05ac87bc7c08e0d"}
Apr 16 04:36:20.381371 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:20.381166 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:36:20.398979 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:20.398935 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw" podStartSLOduration=7.217294203 podStartE2EDuration="7.39892223s" podCreationTimestamp="2026-04-16 04:36:13 +0000 UTC" firstStartedPulling="2026-04-16 04:36:19.375724284 +0000 UTC m=+735.361174326" lastFinishedPulling="2026-04-16 04:36:19.557352298 +0000 UTC m=+735.542802353" observedRunningTime="2026-04-16 04:36:20.397080207 +0000 UTC m=+736.382530296" watchObservedRunningTime="2026-04-16 04:36:20.39892223 +0000 UTC m=+736.384372323"
Apr 16 04:36:31.397146 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:36:31.397113 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw"
Apr 16 04:39:04.559539 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:39:04.559459 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-chgzn_7cfe1750-0b4e-43ab-b858-92eb84a5bd2a/cluster-monitoring-operator/0.log"
Apr 16 04:39:04.562928 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:39:04.562903 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-chgzn_7cfe1750-0b4e-43ab-b858-92eb84a5bd2a/cluster-monitoring-operator/0.log"
Apr 16 04:39:04.566668 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:39:04.566637 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/ovn-acl-logging/0.log"
Apr 16 04:39:04.569643 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:39:04.569615 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/ovn-acl-logging/0.log"
Apr 16 04:44:04.587406 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:44:04.587372 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-chgzn_7cfe1750-0b4e-43ab-b858-92eb84a5bd2a/cluster-monitoring-operator/0.log"
Apr 16 04:44:04.594110 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:44:04.594086 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/ovn-acl-logging/0.log"
Apr 16 04:44:04.594259 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:44:04.594087 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-chgzn_7cfe1750-0b4e-43ab-b858-92eb84a5bd2a/cluster-monitoring-operator/0.log"
Apr 16 04:44:04.601011 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:44:04.600990 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/ovn-acl-logging/0.log"
Apr 16 04:49:04.616352 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:49:04.616234 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-chgzn_7cfe1750-0b4e-43ab-b858-92eb84a5bd2a/cluster-monitoring-operator/0.log"
Apr 16 04:49:04.623850 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:49:04.623800 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/ovn-acl-logging/0.log"
Apr 16 04:49:04.624443 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:49:04.624417 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-chgzn_7cfe1750-0b4e-43ab-b858-92eb84a5bd2a/cluster-monitoring-operator/0.log"
Apr 16 04:49:04.631516 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:49:04.631494 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/ovn-acl-logging/0.log"
Apr 16 04:54:04.647047 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:54:04.646928 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-chgzn_7cfe1750-0b4e-43ab-b858-92eb84a5bd2a/cluster-monitoring-operator/0.log"
Apr 16 04:54:04.657081 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:54:04.657055 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/ovn-acl-logging/0.log"
Apr 16 04:54:04.658725 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:54:04.658701 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-chgzn_7cfe1750-0b4e-43ab-b858-92eb84a5bd2a/cluster-monitoring-operator/0.log"
Apr 16 04:54:04.664850 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:54:04.664816 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/ovn-acl-logging/0.log"
Apr 16 04:54:47.996630 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:54:47.996594 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-bs8tv_c21c3371-b668-4261-a94f-0d5af069b04c/manager/0.log"
Apr 16 04:54:48.375071 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:54:48.374990 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-7bxmc_a850ebad-0876-49e7-a7cf-5b05d5b500ec/manager/1.log"
Apr 16 04:54:48.606320 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:54:48.606276 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-c7946b447-pjqx2_2d4adf34-12c0-4e76-b7e2-e6c39c7e7355/manager/0.log"
Apr 16 04:54:48.821406 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:54:48.821379 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-mk8cn_9c5f9d9d-ff46-4279-8750-f7eda58806b6/postgres/0.log"
Apr 16 04:54:50.197867 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:54:50.197806 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-4cwbl_63284ba8-4e60-4831-a141-b64515390a20/manager/0.log"
Apr 16 04:54:51.318844 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:54:51.318804 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-fzjjz_aae2b43c-b3a7-419a-8da3-5fd4e1646843/discovery/0.log"
Apr 16 04:54:51.427601 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:54:51.427567 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-68d6cc647-m6stc_a9d9ae6f-6579-4770-98c2-5083dadcb564/kube-auth-proxy/0.log"
Apr 16 04:54:51.652707 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:54:51.652612 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-tp7ww_b0f6588a-939a-44db-935c-149b41f7b7ad/istio-proxy/0.log"
Apr 16 04:54:52.351768 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:54:52.351737 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw_9b3f1f82-90bc-4e8b-a076-191c8a1b902c/storage-initializer/0.log"
Apr 16 04:54:52.359266 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:54:52.359232 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcchwwbw_9b3f1f82-90bc-4e8b-a076-191c8a1b902c/main/0.log"
Apr 16 04:54:52.472429 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:54:52.472396 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7_dfebf549-1ed4-4aec-9584-f6343fe655a3/storage-initializer/0.log"
Apr 16 04:54:52.479356 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:54:52.479332 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-pn7q7_dfebf549-1ed4-4aec-9584-f6343fe655a3/main/0.log"
Apr 16 04:54:59.197914 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:54:59.197884 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-28th9_dcdc9bd2-6cdd-48d1-850f-80adbc878d5f/global-pull-secret-syncer/0.log"
Apr 16 04:54:59.386530 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:54:59.386494 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-x8979_98bbb6ac-6605-4f70-9681-333eea1951c2/konnectivity-agent/0.log"
Apr 16 04:54:59.430414 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:54:59.430375 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-81.ec2.internal_558b1f9822c72f9ef387e865da1b3b63/haproxy/0.log"
Apr 16 04:55:04.104270 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:04.104184 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-4cwbl_63284ba8-4e60-4831-a141-b64515390a20/manager/0.log"
Apr 16 04:55:05.994036 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:05.994001 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-chgzn_7cfe1750-0b4e-43ab-b858-92eb84a5bd2a/cluster-monitoring-operator/1.log"
Apr 16 04:55:06.069783 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:06.069745 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-chgzn_7cfe1750-0b4e-43ab-b858-92eb84a5bd2a/cluster-monitoring-operator/0.log"
Apr 16 04:55:06.097526 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:06.097498 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-gf878_845d2afa-de51-4c2f-95f1-6416de335031/kube-state-metrics/0.log"
Apr 16 04:55:06.129809 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:06.129785 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-gf878_845d2afa-de51-4c2f-95f1-6416de335031/kube-rbac-proxy-main/0.log"
Apr 16 04:55:06.156726 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:06.156698 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-gf878_845d2afa-de51-4c2f-95f1-6416de335031/kube-rbac-proxy-self/0.log"
Apr 16 04:55:06.206549 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:06.206497 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-458wb_674ddf7c-c87b-4a2d-905f-ce7f5730ae71/monitoring-plugin/0.log"
Apr 16 04:55:06.398021 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:06.397939 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rgpmw_2c02d360-9b4f-44d2-802b-44aa2d3ca611/node-exporter/0.log"
Apr 16 04:55:06.420286 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:06.420261 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rgpmw_2c02d360-9b4f-44d2-802b-44aa2d3ca611/kube-rbac-proxy/0.log"
Apr 16 04:55:06.440636 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:06.440612 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rgpmw_2c02d360-9b4f-44d2-802b-44aa2d3ca611/init-textfile/0.log"
Apr 16 04:55:06.547091 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:06.547064 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b55b096a-6b44-4567-8071-2f7bb51e4c6a/prometheus/0.log"
Apr 16 04:55:06.568553 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:06.568518 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b55b096a-6b44-4567-8071-2f7bb51e4c6a/config-reloader/0.log"
Apr 16 04:55:06.594806 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:06.594775 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b55b096a-6b44-4567-8071-2f7bb51e4c6a/thanos-sidecar/0.log"
Apr 16 04:55:06.633416 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:06.633389 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b55b096a-6b44-4567-8071-2f7bb51e4c6a/kube-rbac-proxy-web/0.log"
Apr 16 04:55:06.658991 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:06.658920 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b55b096a-6b44-4567-8071-2f7bb51e4c6a/kube-rbac-proxy/0.log"
Apr 16 04:55:06.680518 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:06.680494 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b55b096a-6b44-4567-8071-2f7bb51e4c6a/kube-rbac-proxy-thanos/0.log"
Apr 16 04:55:06.701692 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:06.701665 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b55b096a-6b44-4567-8071-2f7bb51e4c6a/init-config-reloader/0.log"
Apr 16 04:55:06.726961 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:06.726932 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-9pphv_ed3956a3-f976-4384-814d-557aff30f00d/prometheus-operator/0.log"
Apr 16 04:55:06.746126 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:06.746098 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-9pphv_ed3956a3-f976-4384-814d-557aff30f00d/kube-rbac-proxy/0.log"
Apr 16 04:55:06.771403 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:06.771377 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-kf7rc_928211b1-3493-4f55-82b3-0bd62a77bf41/prometheus-operator-admission-webhook/0.log"
Apr 16 04:55:06.804418 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:06.804395 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d76444989-mlm9q_c906f038-e36e-466f-8ce1-62037b784089/telemeter-client/0.log"
Apr 16 04:55:06.828678 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:06.828638 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d76444989-mlm9q_c906f038-e36e-466f-8ce1-62037b784089/reload/0.log"
Apr 16 04:55:06.850322 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:06.850296 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d76444989-mlm9q_c906f038-e36e-466f-8ce1-62037b784089/kube-rbac-proxy/0.log"
Apr 16 04:55:07.695150 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.695112 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq"]
Apr 16 04:55:07.699343 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.699317 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq"
Apr 16 04:55:07.701890 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.701865 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-4r9jh\"/\"default-dockercfg-4f7rr\""
Apr 16 04:55:07.703040 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.703016 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4r9jh\"/\"kube-root-ca.crt\""
Apr 16 04:55:07.703163 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.703016 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4r9jh\"/\"openshift-service-ca.crt\""
Apr 16 04:55:07.704605 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.704582 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq"]
Apr 16 04:55:07.775601 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.775564 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24de87e6-cd84-4801-b371-12b51dc54f43-sys\") pod \"perf-node-gather-daemonset-82wxq\" (UID: \"24de87e6-cd84-4801-b371-12b51dc54f43\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq"
Apr 16 04:55:07.775601 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.775601 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/24de87e6-cd84-4801-b371-12b51dc54f43-podres\") pod \"perf-node-gather-daemonset-82wxq\" (UID: \"24de87e6-cd84-4801-b371-12b51dc54f43\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq"
Apr 16 04:55:07.775864 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.775632 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/24de87e6-cd84-4801-b371-12b51dc54f43-lib-modules\") pod \"perf-node-gather-daemonset-82wxq\" (UID: \"24de87e6-cd84-4801-b371-12b51dc54f43\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq"
Apr 16 04:55:07.775864 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.775747 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbgg6\" (UniqueName: \"kubernetes.io/projected/24de87e6-cd84-4801-b371-12b51dc54f43-kube-api-access-gbgg6\") pod \"perf-node-gather-daemonset-82wxq\" (UID: \"24de87e6-cd84-4801-b371-12b51dc54f43\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq"
Apr 16 04:55:07.775864 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.775791 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/24de87e6-cd84-4801-b371-12b51dc54f43-proc\") pod \"perf-node-gather-daemonset-82wxq\" (UID: \"24de87e6-cd84-4801-b371-12b51dc54f43\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq"
Apr 16 04:55:07.876648 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.876612 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24de87e6-cd84-4801-b371-12b51dc54f43-sys\") pod \"perf-node-gather-daemonset-82wxq\" (UID: \"24de87e6-cd84-4801-b371-12b51dc54f43\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq"
Apr 16 04:55:07.876648 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.876650 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/24de87e6-cd84-4801-b371-12b51dc54f43-podres\") pod \"perf-node-gather-daemonset-82wxq\" (UID: \"24de87e6-cd84-4801-b371-12b51dc54f43\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq"
Apr 16 04:55:07.876886 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.876678 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/24de87e6-cd84-4801-b371-12b51dc54f43-lib-modules\") pod \"perf-node-gather-daemonset-82wxq\" (UID: \"24de87e6-cd84-4801-b371-12b51dc54f43\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq"
Apr 16 04:55:07.876886 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.876725 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbgg6\" (UniqueName: \"kubernetes.io/projected/24de87e6-cd84-4801-b371-12b51dc54f43-kube-api-access-gbgg6\") pod \"perf-node-gather-daemonset-82wxq\" (UID: \"24de87e6-cd84-4801-b371-12b51dc54f43\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq"
Apr 16 04:55:07.876886 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.876748 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/24de87e6-cd84-4801-b371-12b51dc54f43-proc\") pod \"perf-node-gather-daemonset-82wxq\" (UID: \"24de87e6-cd84-4801-b371-12b51dc54f43\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq"
Apr 16 04:55:07.876886 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.876757 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24de87e6-cd84-4801-b371-12b51dc54f43-sys\") pod \"perf-node-gather-daemonset-82wxq\" (UID: \"24de87e6-cd84-4801-b371-12b51dc54f43\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq"
Apr 16 04:55:07.876886 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.876845 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/24de87e6-cd84-4801-b371-12b51dc54f43-proc\") pod \"perf-node-gather-daemonset-82wxq\" (UID: \"24de87e6-cd84-4801-b371-12b51dc54f43\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq"
Apr 16 04:55:07.876886 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.876853 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/24de87e6-cd84-4801-b371-12b51dc54f43-podres\") pod \"perf-node-gather-daemonset-82wxq\" (UID: \"24de87e6-cd84-4801-b371-12b51dc54f43\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq"
Apr 16 04:55:07.876886 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.876880 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/24de87e6-cd84-4801-b371-12b51dc54f43-lib-modules\") pod \"perf-node-gather-daemonset-82wxq\" (UID: \"24de87e6-cd84-4801-b371-12b51dc54f43\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq"
Apr 16 04:55:07.884488 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.884460 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbgg6\" (UniqueName: \"kubernetes.io/projected/24de87e6-cd84-4801-b371-12b51dc54f43-kube-api-access-gbgg6\") pod
\"perf-node-gather-daemonset-82wxq\" (UID: \"24de87e6-cd84-4801-b371-12b51dc54f43\") " pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq" Apr 16 04:55:07.951753 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:07.951680 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-ltqzz_cd8f751f-9b6f-4484-bebc-72c3ef2e887a/networking-console-plugin/0.log" Apr 16 04:55:08.011525 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:08.011471 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq" Apr 16 04:55:08.146403 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:08.146330 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq"] Apr 16 04:55:08.149591 ip-10-0-133-81 kubenswrapper[2567]: W0416 04:55:08.149560 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod24de87e6_cd84_4801_b371_12b51dc54f43.slice/crio-7a62ef3f29229b2aac82bdc96dc78cf4bd7bb4de06df52c56fc609611092862a WatchSource:0}: Error finding container 7a62ef3f29229b2aac82bdc96dc78cf4bd7bb4de06df52c56fc609611092862a: Status 404 returned error can't find the container with id 7a62ef3f29229b2aac82bdc96dc78cf4bd7bb4de06df52c56fc609611092862a Apr 16 04:55:08.151434 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:08.151417 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 04:55:08.667912 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:08.667789 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq" event={"ID":"24de87e6-cd84-4801-b371-12b51dc54f43","Type":"ContainerStarted","Data":"8a1d6b4f2a1ab3c392491243c9e2624b89d0eafc63be948150f5da68f3755da4"} Apr 16 04:55:08.667912 ip-10-0-133-81 kubenswrapper[2567]: I0416 
04:55:08.667861 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq" event={"ID":"24de87e6-cd84-4801-b371-12b51dc54f43","Type":"ContainerStarted","Data":"7a62ef3f29229b2aac82bdc96dc78cf4bd7bb4de06df52c56fc609611092862a"} Apr 16 04:55:08.667912 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:08.667882 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq" Apr 16 04:55:08.683945 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:08.683896 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq" podStartSLOduration=1.683856118 podStartE2EDuration="1.683856118s" podCreationTimestamp="2026-04-16 04:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 04:55:08.683199168 +0000 UTC m=+1864.668649230" watchObservedRunningTime="2026-04-16 04:55:08.683856118 +0000 UTC m=+1864.669306182" Apr 16 04:55:09.387738 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:09.387707 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-crmkh_82c18366-78a3-498d-b2b0-202d98470a14/volume-data-source-validator/0.log" Apr 16 04:55:10.165910 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:10.165880 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-m5jm4_390e05c5-2dbf-454b-872e-6a8969a124ae/dns/0.log" Apr 16 04:55:10.186876 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:10.186845 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-m5jm4_390e05c5-2dbf-454b-872e-6a8969a124ae/kube-rbac-proxy/0.log" Apr 16 04:55:10.255575 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:10.255542 2567 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7pjb8_c35154e3-77e2-4f41-b5e8-99905ca385f9/dns-node-resolver/0.log" Apr 16 04:55:10.842805 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:10.842773 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tns7g_f41140f0-dc31-4907-aae1-f8d108bb517f/node-ca/0.log" Apr 16 04:55:11.727010 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:11.726974 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-fzjjz_aae2b43c-b3a7-419a-8da3-5fd4e1646843/discovery/0.log" Apr 16 04:55:11.745799 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:11.745775 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-68d6cc647-m6stc_a9d9ae6f-6579-4770-98c2-5083dadcb564/kube-auth-proxy/0.log" Apr 16 04:55:11.827840 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:11.827794 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-tp7ww_b0f6588a-939a-44db-935c-149b41f7b7ad/istio-proxy/0.log" Apr 16 04:55:12.345733 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:12.345698 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6jqxs_c19b5e4b-3455-4e1c-b332-ba6c51fb153b/serve-healthcheck-canary/0.log" Apr 16 04:55:12.809642 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:12.809592 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-zwp94_65296c0d-211b-4c4b-8926-070aad0da721/insights-operator/0.log" Apr 16 04:55:12.810515 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:12.810497 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-zwp94_65296c0d-211b-4c4b-8926-070aad0da721/insights-operator/1.log" Apr 16 04:55:12.830127 ip-10-0-133-81 kubenswrapper[2567]: 
I0416 04:55:12.830099 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8kmvb_4e36f31c-8145-464a-af78-558a0e3d7c33/kube-rbac-proxy/0.log" Apr 16 04:55:12.850749 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:12.850724 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8kmvb_4e36f31c-8145-464a-af78-558a0e3d7c33/exporter/0.log" Apr 16 04:55:12.871043 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:12.871014 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8kmvb_4e36f31c-8145-464a-af78-558a0e3d7c33/extractor/0.log" Apr 16 04:55:14.689996 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:14.689961 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-4r9jh/perf-node-gather-daemonset-82wxq" Apr 16 04:55:14.879726 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:14.879692 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-bs8tv_c21c3371-b668-4261-a94f-0d5af069b04c/manager/0.log" Apr 16 04:55:14.991587 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:14.991537 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-7bxmc_a850ebad-0876-49e7-a7cf-5b05d5b500ec/manager/0.log" Apr 16 04:55:15.003444 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:15.003418 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-7bxmc_a850ebad-0876-49e7-a7cf-5b05d5b500ec/manager/1.log" Apr 16 04:55:15.062410 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:15.062386 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-c7946b447-pjqx2_2d4adf34-12c0-4e76-b7e2-e6c39c7e7355/manager/0.log" Apr 16 04:55:15.112512 ip-10-0-133-81 kubenswrapper[2567]: I0416 
04:55:15.112483 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-mk8cn_9c5f9d9d-ff46-4279-8750-f7eda58806b6/postgres/0.log" Apr 16 04:55:16.453249 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:16.453222 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-bs8d5_5493e96d-031c-4086-835b-81957ad31956/openshift-lws-operator/0.log" Apr 16 04:55:20.725956 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:20.725924 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-dzsx5_25544907-fe6f-4ea8-bb7d-71f79b123309/migrator/0.log" Apr 16 04:55:20.747229 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:20.747203 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-dzsx5_25544907-fe6f-4ea8-bb7d-71f79b123309/graceful-termination/0.log" Apr 16 04:55:22.072386 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:22.072360 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6wjzz_84db4e3c-a849-4173-b21b-fbb75fd25be3/kube-multus-additional-cni-plugins/0.log" Apr 16 04:55:22.098052 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:22.098025 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6wjzz_84db4e3c-a849-4173-b21b-fbb75fd25be3/egress-router-binary-copy/0.log" Apr 16 04:55:22.119737 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:22.119714 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6wjzz_84db4e3c-a849-4173-b21b-fbb75fd25be3/cni-plugins/0.log" Apr 16 04:55:22.141421 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:22.141394 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6wjzz_84db4e3c-a849-4173-b21b-fbb75fd25be3/bond-cni-plugin/0.log" Apr 16 04:55:22.160966 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:22.160941 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6wjzz_84db4e3c-a849-4173-b21b-fbb75fd25be3/routeoverride-cni/0.log" Apr 16 04:55:22.181430 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:22.181403 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6wjzz_84db4e3c-a849-4173-b21b-fbb75fd25be3/whereabouts-cni-bincopy/0.log" Apr 16 04:55:22.201032 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:22.201013 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6wjzz_84db4e3c-a849-4173-b21b-fbb75fd25be3/whereabouts-cni/0.log" Apr 16 04:55:22.567177 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:22.567146 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n4qkb_c70b5e71-9ba3-4891-851c-653635c97ffb/kube-multus/0.log" Apr 16 04:55:22.697330 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:22.697305 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-j6hlh_638d6e19-46c9-4d63-a7b2-461e842da022/network-metrics-daemon/0.log" Apr 16 04:55:22.716241 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:22.716212 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-j6hlh_638d6e19-46c9-4d63-a7b2-461e842da022/kube-rbac-proxy/0.log" Apr 16 04:55:24.080447 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:24.080406 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/ovn-controller/0.log" Apr 16 04:55:24.099393 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:24.099365 2567 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/ovn-acl-logging/0.log" Apr 16 04:55:24.107864 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:24.107816 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/ovn-acl-logging/1.log" Apr 16 04:55:24.125831 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:24.125809 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/kube-rbac-proxy-node/0.log" Apr 16 04:55:24.146261 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:24.146224 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 04:55:24.166388 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:24.166366 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/northd/0.log" Apr 16 04:55:24.186413 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:24.186389 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/nbdb/0.log" Apr 16 04:55:24.208439 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:24.208412 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/sbdb/0.log" Apr 16 04:55:24.327906 ip-10-0-133-81 kubenswrapper[2567]: I0416 04:55:24.327871 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s79zz_fbe3cd6d-315c-4d44-81a3-217be3d98348/ovnkube-controller/0.log"